[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 10896 1726882157.61984: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-spT executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 10896 1726882157.62897: Added group all to inventory 10896 1726882157.62900: Added group ungrouped to inventory 10896 1726882157.62904: Group all now contains ungrouped 10896 1726882157.62907: Examining possible inventory source: /tmp/network-Kc3/inventory.yml 10896 1726882157.91405: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 10896 1726882157.91580: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 10896 1726882157.91608: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 10896 1726882157.91667: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 10896 1726882157.91857: Loaded config def from plugin (inventory/script) 10896 1726882157.91859: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 10896 1726882157.92006: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 10896 1726882157.92175: Loaded config def from plugin (inventory/yaml) 10896 1726882157.92178: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 10896 1726882157.92508: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 10896 1726882157.93392: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 10896 1726882157.93431: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 10896 1726882157.93435: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 10896 1726882157.93441: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 10896 1726882157.93445: Loading data from /tmp/network-Kc3/inventory.yml 10896 1726882157.93618: /tmp/network-Kc3/inventory.yml was not parsable by auto 10896 1726882157.93759: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 10896 1726882157.93798: Loading data from /tmp/network-Kc3/inventory.yml 10896 1726882157.93998: group all already in inventory 10896 1726882157.94005: set inventory_file for managed_node1 10896 1726882157.94009: set inventory_dir for managed_node1 10896 1726882157.94010: Added host managed_node1 to inventory 10896 1726882157.94012: Added host managed_node1 to group all 10896 1726882157.94013: set ansible_host for managed_node1 10896 1726882157.94014: 
set ansible_ssh_extra_args for managed_node1 10896 1726882157.94017: set inventory_file for managed_node2 10896 1726882157.94019: set inventory_dir for managed_node2 10896 1726882157.94020: Added host managed_node2 to inventory 10896 1726882157.94022: Added host managed_node2 to group all 10896 1726882157.94022: set ansible_host for managed_node2 10896 1726882157.94023: set ansible_ssh_extra_args for managed_node2 10896 1726882157.94026: set inventory_file for managed_node3 10896 1726882157.94028: set inventory_dir for managed_node3 10896 1726882157.94028: Added host managed_node3 to inventory 10896 1726882157.94030: Added host managed_node3 to group all 10896 1726882157.94030: set ansible_host for managed_node3 10896 1726882157.94031: set ansible_ssh_extra_args for managed_node3 10896 1726882157.94033: Reconcile groups and hosts in inventory. 10896 1726882157.94037: Group ungrouped now contains managed_node1 10896 1726882157.94038: Group ungrouped now contains managed_node2 10896 1726882157.94040: Group ungrouped now contains managed_node3 10896 1726882157.94241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 10896 1726882157.94491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 10896 1726882157.94655: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 10896 1726882157.94683: Loaded config def from plugin (vars/host_group_vars) 10896 1726882157.94686: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 10896 1726882157.94766: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 10896 1726882157.94776: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 10896 1726882157.94903: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 10896 1726882157.95644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882157.95854: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 10896 1726882157.95969: Loaded config def from plugin (connection/local) 10896 1726882157.95973: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 10896 1726882157.96691: Loaded config def from plugin (connection/paramiko_ssh) 10896 1726882157.96696: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 10896 1726882157.98376: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 10896 1726882157.98532: Loaded config def from plugin (connection/psrp) 10896 1726882157.98535: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 10896 1726882158.00098: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 10896 1726882158.00212: Loaded config def from plugin (connection/ssh) 10896 1726882158.00215: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 10896 1726882158.04648: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 10896 1726882158.04689: Loaded config def from plugin (connection/winrm) 10896 1726882158.04692: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 10896 1726882158.04726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 10896 1726882158.04925: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 10896 1726882158.05003: Loaded config def from plugin (shell/cmd) 10896 1726882158.05006: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 10896 1726882158.05033: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 10896 1726882158.05109: Loaded config def from plugin (shell/powershell) 10896 1726882158.05111: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 10896 1726882158.05164: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 10896 1726882158.05358: Loaded config def from plugin (shell/sh) 10896 1726882158.05360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 10896 1726882158.05392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 10896 1726882158.05505: Loaded config def from plugin (become/runas) 10896 1726882158.05507: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 10896 1726882158.05700: Loaded config def from plugin (become/su) 10896 1726882158.05702: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 10896 1726882158.05873: Loaded config def from plugin (become/sudo) 10896 1726882158.05876: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 10896 1726882158.05912: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml 10896 1726882158.06251: in VariableManager get_vars() 10896 1726882158.06279: done with get_vars() 10896 1726882158.06417: trying /usr/local/lib/python3.12/site-packages/ansible/modules 10896 1726882158.12071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 10896 1726882158.12407: in VariableManager 
get_vars() 10896 1726882158.12412: done with get_vars() 10896 1726882158.12415: variable 'playbook_dir' from source: magic vars 10896 1726882158.12416: variable 'ansible_playbook_python' from source: magic vars 10896 1726882158.12417: variable 'ansible_config_file' from source: magic vars 10896 1726882158.12417: variable 'groups' from source: magic vars 10896 1726882158.12418: variable 'omit' from source: magic vars 10896 1726882158.12419: variable 'ansible_version' from source: magic vars 10896 1726882158.12420: variable 'ansible_check_mode' from source: magic vars 10896 1726882158.12421: variable 'ansible_diff_mode' from source: magic vars 10896 1726882158.12422: variable 'ansible_forks' from source: magic vars 10896 1726882158.12422: variable 'ansible_inventory_sources' from source: magic vars 10896 1726882158.12423: variable 'ansible_skip_tags' from source: magic vars 10896 1726882158.12424: variable 'ansible_limit' from source: magic vars 10896 1726882158.12425: variable 'ansible_run_tags' from source: magic vars 10896 1726882158.12425: variable 'ansible_verbosity' from source: magic vars 10896 1726882158.12467: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml 10896 1726882158.14030: in VariableManager get_vars() 10896 1726882158.14047: done with get_vars() 10896 1726882158.14057: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 10896 1726882158.15937: in VariableManager get_vars() 10896 1726882158.15951: done with get_vars() 10896 1726882158.15959: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 10896 1726882158.16076: in VariableManager get_vars() 10896 1726882158.16105: done with get_vars() 10896 1726882158.16240: in VariableManager get_vars() 10896 1726882158.16253: done with get_vars() 10896 1726882158.16261: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 10896 1726882158.16340: in VariableManager get_vars() 10896 1726882158.16355: done with get_vars() 10896 1726882158.16657: in VariableManager get_vars() 10896 1726882158.16669: done with get_vars() 10896 1726882158.16672: variable 'omit' from source: magic vars 10896 1726882158.16688: variable 'omit' from source: magic vars 10896 1726882158.16722: in VariableManager get_vars() 10896 1726882158.16848: done with get_vars() 10896 1726882158.16897: in VariableManager get_vars() 10896 1726882158.16911: done with get_vars() 10896 1726882158.17001: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 10896 1726882158.17434: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 10896 1726882158.17798: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 
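The inventory handling recorded above (the yaml plugin parsing /tmp/network-Kc3/inventory.yml, adding managed_node1, managed_node2 and managed_node3 to 'all'/'ungrouped', and setting ansible_host and ansible_ssh_extra_args for each host) is consistent with a flat host-level YAML inventory along the lines of the sketch below. The hostnames and variable names come from the log; the addresses and SSH options are placeholders, since the log never prints the actual values (the later SSH debug output only shows a connection to 10.31.14.69 for managed_node2).

# Hypothetical reconstruction of /tmp/network-Kc3/inventory.yml; all values are placeholders.
all:
  hosts:
    managed_node1:
      ansible_host: 192.0.2.1                                   # placeholder address
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"     # placeholder options
    managed_node2:
      ansible_host: 192.0.2.2                                   # placeholder address
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"     # placeholder options
    managed_node3:
      ansible_host: 192.0.2.3                                   # placeholder address
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"     # placeholder options

Hosts defined directly under all.hosts fall into the implicit 'ungrouped' group, which matches the "Group ungrouped now contains managed_nodeN" entries above.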
10896 1726882158.19692: in VariableManager get_vars() 10896 1726882158.19930: done with get_vars() 10896 1726882158.20848: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 10896 1726882158.21220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 10896 1726882158.25356: in VariableManager get_vars() 10896 1726882158.25436: done with get_vars() 10896 1726882158.25446: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 10896 1726882158.25922: in VariableManager get_vars() 10896 1726882158.25944: done with get_vars() 10896 1726882158.26370: in VariableManager get_vars() 10896 1726882158.26386: done with get_vars() 10896 1726882158.27034: in VariableManager get_vars() 10896 1726882158.27060: done with get_vars() 10896 1726882158.27066: variable 'omit' from source: magic vars 10896 1726882158.27091: variable 'omit' from source: magic vars 10896 1726882158.27131: in VariableManager get_vars() 10896 1726882158.27145: done with get_vars() 10896 1726882158.27172: in VariableManager get_vars() 10896 1726882158.27188: done with get_vars() 10896 1726882158.27226: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 10896 1726882158.27538: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 10896 1726882158.32451: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 10896 1726882158.33295: in VariableManager get_vars() 10896 1726882158.33438: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 10896 1726882158.37765: in VariableManager get_vars() 10896 1726882158.37794: done with get_vars() 10896 1726882158.37805: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 10896 1726882158.38383: in VariableManager get_vars() 10896 1726882158.38413: done with get_vars() 10896 1726882158.38475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 10896 1726882158.38504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 10896 1726882158.38782: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 10896 1726882158.39235: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 10896 1726882158.39238: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 10896 1726882158.39270: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 10896 1726882158.39432: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 10896 1726882158.39614: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 10896 1726882158.39674: Loaded config def from plugin (callback/default) 10896 1726882158.39679: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 10896 1726882158.42568: Loaded config def from plugin (callback/junit) 10896 1726882158.42571: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 10896 1726882158.42740: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 10896 1726882158.42912: Loaded config def from plugin (callback/minimal) 10896 1726882158.42914: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 10896 1726882158.43075: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 10896 1726882158.43136: Loaded config def from plugin (callback/tree) 10896 1726882158.43139: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 10896 1726882158.43500: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 10896 1726882158.43503: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
PLAYBOOK: tests_bond_deprecated_nm.yml ***************************************** 2 plays in /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml 10896 1726882158.43531: in VariableManager get_vars() 10896 1726882158.43545: done with get_vars() 10896 1726882158.43552: in VariableManager get_vars() 10896 1726882158.43562: done with get_vars() 10896 1726882158.43566: variable 'omit' from source: magic vars 10896 1726882158.43684: in VariableManager get_vars() 10896 1726882158.43701: done with get_vars() 10896 1726882158.43732: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_bond_deprecated.yml' with nm as provider] *** 10896 1726882158.45054: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 10896 1726882158.45199: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 10896 1726882158.45291: getting the remaining hosts for this loop 10896 1726882158.45295: done getting the remaining hosts for this loop 10896 1726882158.45298: getting the next task for host managed_node2 10896 1726882158.45301: done getting next task for host managed_node2 10896 1726882158.45303: ^ task is: TASK: Gathering Facts 10896 1726882158.45305: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882158.45307: getting variables 10896 1726882158.45307: in VariableManager get_vars() 10896 1726882158.45316: Calling all_inventory to load vars for managed_node2 10896 1726882158.45318: Calling groups_inventory to load vars for managed_node2 10896 1726882158.45320: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882158.45389: Calling all_plugins_play to load vars for managed_node2 10896 1726882158.45405: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882158.45409: Calling groups_plugins_play to load vars for managed_node2 10896 1726882158.45507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882158.45622: done with get_vars() 10896 1726882158.45628: done getting variables 10896 1726882158.45804: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:6 Friday 20 September 2024 21:29:18 -0400 (0:00:00.025) 0:00:00.025 ****** 10896 1726882158.45886: entering _queue_task() for managed_node2/gather_facts 10896 1726882158.45887: Creating lock for gather_facts 10896 1726882158.46565: worker is 1 (out of 1 available) 10896 1726882158.46575: exiting _queue_task() for managed_node2/gather_facts 10896 1726882158.46761: done queuing things up, now waiting for results queue to drain 10896 1726882158.46763: waiting for pending results... 
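The banner above names the wrapper file and its first play ("Run playbook 'playbooks/tests_bond_deprecated.yml' with nm as provider"). The sketch below shows the usual shape of such a provider wrapper in the linux_system_roles test suite: a short play that selects the NetworkManager provider, followed by an import of the provider-agnostic test playbook. Only the play name and file paths are taken from the log; the task body is an assumption, not a copy of the real file.

# Hypothetical sketch of tests_bond_deprecated_nm.yml (actual contents are not shown in this log).
- name: Run playbook 'playbooks/tests_bond_deprecated.yml' with nm as provider
  hosts: all
  tasks:
    - name: Use NetworkManager as the provider for this run    # assumed task
      set_fact:
        network_provider: nm

- name: Import the provider-agnostic bond test playbook        # assumed structure
  import_playbook: playbooks/tests_bond_deprecated.yml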
10896 1726882158.47139: running TaskExecutor() for managed_node2/TASK: Gathering Facts 10896 1726882158.47333: in run() - task 12673a56-9f93-8b02-b216-0000000000cd 10896 1726882158.47355: variable 'ansible_search_path' from source: unknown 10896 1726882158.47462: calling self._execute() 10896 1726882158.47673: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882158.47677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882158.47679: variable 'omit' from source: magic vars 10896 1726882158.47935: variable 'omit' from source: magic vars 10896 1726882158.47943: variable 'omit' from source: magic vars 10896 1726882158.48089: variable 'omit' from source: magic vars 10896 1726882158.48189: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882158.48443: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882158.48450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882158.48590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882158.48595: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882158.48613: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882158.48621: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882158.48665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882158.48910: Set connection var ansible_connection to ssh 10896 1726882158.48914: Set connection var ansible_timeout to 10 10896 1726882158.48916: Set connection var ansible_shell_type to sh 10896 1726882158.48952: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882158.48962: Set connection var ansible_shell_executable to /bin/sh 10896 1726882158.48995: Set connection var ansible_pipelining to False 10896 1726882158.49043: variable 'ansible_shell_executable' from source: unknown 10896 1726882158.49075: variable 'ansible_connection' from source: unknown 10896 1726882158.49140: variable 'ansible_module_compression' from source: unknown 10896 1726882158.49143: variable 'ansible_shell_type' from source: unknown 10896 1726882158.49146: variable 'ansible_shell_executable' from source: unknown 10896 1726882158.49148: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882158.49150: variable 'ansible_pipelining' from source: unknown 10896 1726882158.49153: variable 'ansible_timeout' from source: unknown 10896 1726882158.49154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882158.49670: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882158.49674: variable 'omit' from source: magic vars 10896 1726882158.49677: starting attempt loop 10896 1726882158.49681: running the handler 10896 1726882158.49898: variable 'ansible_facts' from source: unknown 10896 1726882158.49903: _low_level_execute_command(): starting 10896 1726882158.49905: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882158.51771: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882158.51775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882158.51778: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882158.51780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882158.52460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882158.52471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882158.52699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882158.52910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882158.54535: stdout chunk (state=3): >>>/root <<< 10896 1726882158.54607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882158.54611: stdout chunk (state=3): >>><<< 10896 1726882158.54626: stderr chunk (state=3): >>><<< 10896 1726882158.54730: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882158.54734: _low_level_execute_command(): starting 10896 1726882158.54737: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882158.546461-10945-226747738681236 `" && echo ansible-tmp-1726882158.546461-10945-226747738681236="` echo 
/root/.ansible/tmp/ansible-tmp-1726882158.546461-10945-226747738681236 `" ) && sleep 0' 10896 1726882158.55761: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882158.56096: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882158.56123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882158.56213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882158.58128: stdout chunk (state=3): >>>ansible-tmp-1726882158.546461-10945-226747738681236=/root/.ansible/tmp/ansible-tmp-1726882158.546461-10945-226747738681236 <<< 10896 1726882158.58468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882158.58480: stdout chunk (state=3): >>><<< 10896 1726882158.58499: stderr chunk (state=3): >>><<< 10896 1726882158.58523: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882158.546461-10945-226747738681236=/root/.ansible/tmp/ansible-tmp-1726882158.546461-10945-226747738681236 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882158.58562: variable 'ansible_module_compression' from source: unknown 10896 1726882158.58801: ANSIBALLZ: Using generic lock for ansible.legacy.setup 10896 1726882158.58804: ANSIBALLZ: Acquiring lock 10896 1726882158.58806: ANSIBALLZ: Lock acquired: 139646160836496 10896 1726882158.58809: ANSIBALLZ: Creating module 10896 1726882159.30548: ANSIBALLZ: 
Writing module into payload 10896 1726882159.31101: ANSIBALLZ: Writing module 10896 1726882159.31382: ANSIBALLZ: Renaming module 10896 1726882159.31385: ANSIBALLZ: Done creating module 10896 1726882159.31387: variable 'ansible_facts' from source: unknown 10896 1726882159.31390: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882159.31402: _low_level_execute_command(): starting 10896 1726882159.31405: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 10896 1726882159.33014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882159.33169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882159.33261: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882159.33337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882159.35024: stdout chunk (state=3): >>>PLATFORM <<< 10896 1726882159.35100: stdout chunk (state=3): >>>Linux <<< 10896 1726882159.35130: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 10896 1726882159.35305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882159.35315: stdout chunk (state=3): >>><<< 10896 1726882159.35325: stderr chunk (state=3): >>><<< 10896 1726882159.35346: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882159.35459 [managed_node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 10896 1726882159.35569: _low_level_execute_command(): starting 10896 1726882159.35630: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 10896 1726882159.35855: Sending initial data 10896 1726882159.35864: Sent initial data (1181 bytes) 10896 1726882159.37106: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882159.37110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882159.37112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882159.37411: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882159.37423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882159.37505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882159.37617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882159.37728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882159.41163: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 10896 1726882159.41515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882159.41640: stderr chunk (state=3): >>><<< 10896 1726882159.41644: stdout chunk (state=3): >>><<< 10896 1726882159.41646: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882159.41761: variable 'ansible_facts' from source: unknown 10896 1726882159.41764: variable 'ansible_facts' from source: unknown 10896 1726882159.41864: variable 'ansible_module_compression' from source: unknown 10896 1726882159.42089: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 10896 1726882159.42092: variable 'ansible_facts' from source: unknown 10896 1726882159.42437: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882158.546461-10945-226747738681236/AnsiballZ_setup.py 10896 1726882159.42853: Sending initial data 10896 1726882159.43010: Sent initial data (153 bytes) 10896 1726882159.44192: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882159.44313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882159.44327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882159.44568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882159.44626: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882159.46208: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882159.46290: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882159.46422: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpx34oa7w7 /root/.ansible/tmp/ansible-tmp-1726882158.546461-10945-226747738681236/AnsiballZ_setup.py <<< 10896 1726882159.46444: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882158.546461-10945-226747738681236/AnsiballZ_setup.py" <<< 10896 1726882159.46564: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpx34oa7w7" to remote "/root/.ansible/tmp/ansible-tmp-1726882158.546461-10945-226747738681236/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882158.546461-10945-226747738681236/AnsiballZ_setup.py" <<< 10896 1726882159.50099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882159.50152: stdout chunk (state=3): >>><<< 10896 1726882159.50156: stderr chunk (state=3): >>><<< 10896 1726882159.50158: done transferring module to remote 10896 1726882159.50212: _low_level_execute_command(): starting 10896 1726882159.50302: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882158.546461-10945-226747738681236/ /root/.ansible/tmp/ansible-tmp-1726882158.546461-10945-226747738681236/AnsiballZ_setup.py && sleep 0' 10896 1726882159.51780: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882159.51825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882159.51861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
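The probes in the preceding chunks are ansible-core's automatic interpreter discovery: the 'command -v python3.12 ...' listing found /usr/bin/python3.12, which was then used to read /etc/os-release; the AnsiballZ_setup.py payload has just been transferred over SFTP and is being made executable before it is run. When the target interpreter is already known, that discovery round-trip can be avoided by pinning the standard ansible_python_interpreter variable. The snippet below is a sketch of a group_vars entry for that, not something present in this run.

# Hypothetical group_vars/all.yml entry; pins the interpreter that discovery selected above.
ansible_python_interpreter: /usr/bin/python3.12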
10896 1726882159.51920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882159.51980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882159.53799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882159.53833: stdout chunk (state=3): >>><<< 10896 1726882159.53838: stderr chunk (state=3): >>><<< 10896 1726882159.53841: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882159.53843: _low_level_execute_command(): starting 10896 1726882159.53845: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882158.546461-10945-226747738681236/AnsiballZ_setup.py && sleep 0' 10896 1726882159.55006: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882159.55049: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882159.55062: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882159.55101: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 10896 1726882159.55115: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882159.55181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882159.55229: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882159.55375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882159.57579: stdout chunk 
(state=3): >>>import _frozen_importlib # frozen <<< 10896 1726882159.57584: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 10896 1726882159.57607: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 10896 1726882159.57628: stdout chunk (state=3): >>>import 'posix' # <<< 10896 1726882159.57703: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 10896 1726882159.57706: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 10896 1726882159.57758: stdout chunk (state=3): >>># installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 10896 1726882159.57801: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 10896 1726882159.57847: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 10896 1726882159.57897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3c184d0> <<< 10896 1726882159.58008: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3be7b30> <<< 10896 1726882159.58011: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3c1aa50> import '_signal' # <<< 10896 1726882159.58013: stdout chunk (state=3): >>>import '_abc' # <<< 10896 1726882159.58015: stdout chunk (state=3): >>>import 'abc' # <<< 10896 1726882159.58117: stdout chunk (state=3): >>>import 'io' # import '_stat' # import 'stat' # <<< 10896 1726882159.58235: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 10896 1726882159.58239: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 10896 1726882159.58241: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 10896 1726882159.58338: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 10896 1726882159.58344: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a2dfa0> <<< 10896 1726882159.58361: stdout chunk (state=3): >>>import 'site' # <<< 10896 1726882159.58407: 
stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 10896 1726882159.58774: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 10896 1726882159.58882: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 10896 1726882159.58888: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 10896 1726882159.58890: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 10896 1726882159.58897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 10896 1726882159.58974: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a6bdd0> <<< 10896 1726882159.59015: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a6bfe0> <<< 10896 1726882159.59089: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 10896 1726882159.59111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 10896 1726882159.59142: stdout chunk (state=3): >>>import 'itertools' # <<< 10896 1726882159.59161: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3aa37a0> <<< 10896 1726882159.59474: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 10896 1726882159.59481: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3aa3e30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a83aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a811c0> <<< 10896 1726882159.59484: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a68f80> <<< 10896 1726882159.59486: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 10896 1726882159.59488: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 10896 1726882159.59527: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 10896 1726882159.59550: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3ac3710> <<< 10896 1726882159.59572: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3ac2330> <<< 10896 1726882159.59605: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a82090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3ac0b90> <<< 10896 1726882159.59669: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3af8740> <<< 10896 1726882159.59687: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a68200> <<< 10896 1726882159.59736: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 10896 1726882159.59855: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3af8bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3af8aa0> <<< 10896 1726882159.59887: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3af8e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a66d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 10896 1726882159.59903: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3af9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7faae3af9250> <<< 10896 1726882159.59927: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 10896 1726882159.59959: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3afa480> <<< 10896 1726882159.60083: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 10896 1726882159.60096: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3b10680> import 'errno' # <<< 10896 1726882159.60125: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 10896 1726882159.60165: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3b11d60> <<< 10896 1726882159.60180: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 10896 1726882159.60324: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3b12c00> <<< 10896 1726882159.60327: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3b13260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3b12150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3b13ce0> <<< 10896 1726882159.60349: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3b13410> <<< 10896 1726882159.60384: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3afa4b0> <<< 10896 1726882159.60526: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 10896 
1726882159.60531: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3857bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 10896 1726882159.60576: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae38806e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3880440> <<< 10896 1726882159.60604: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3880710> <<< 10896 1726882159.60620: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 10896 1726882159.60687: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 10896 1726882159.60828: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3881040> <<< 10896 1726882159.60997: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3881a30> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae38808f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3855d60> <<< 10896 1726882159.61002: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 10896 1726882159.61069: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3882de0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3881b50> <<< 10896 1726882159.61100: stdout chunk (state=3): >>>import 'tempfile' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7faae3afaba0> <<< 10896 1726882159.61154: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 10896 1726882159.61178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 10896 1726882159.61285: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 10896 1726882159.61315: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae38af140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 10896 1726882159.61334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 10896 1726882159.61355: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 10896 1726882159.61390: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae38cf500> <<< 10896 1726882159.61414: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 10896 1726882159.61625: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae39302c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 10896 1726882159.61647: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 10896 1726882159.61728: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3932a20> <<< 10896 1726882159.61806: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae39303e0> <<< 10896 1726882159.61836: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae38f92e0> <<< 10896 1726882159.61859: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 10896 1726882159.61888: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae37393d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae38ce300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3883d10> <<< 10896 
1726882159.62153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 10896 1726882159.62173: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7faae38ce900> <<< 10896 1726882159.62342: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_xc47bg__/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 10896 1726882159.62457: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.62486: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 10896 1726882159.62503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 10896 1726882159.62536: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 10896 1726882159.62613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 10896 1726882159.62645: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae379f0e0> import '_typing' # <<< 10896 1726882159.62832: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae377dfd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae377d160> <<< 10896 1726882159.62918: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.62922: stdout chunk (state=3): >>>import 'ansible' # <<< 10896 1726882159.62943: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 10896 1726882159.64322: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.65484: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae379cfb0> <<< 10896 1726882159.65735: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 10896 1726882159.65742: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae37ce960> import 
'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae37ce6f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae37ce030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae37ce750> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae379fd70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae37cf680> <<< 10896 1726882159.65761: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 10896 1726882159.65784: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae37cf8c0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 10896 1726882159.65821: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 10896 1726882159.65843: stdout chunk (state=3): >>>import '_locale' # <<< 10896 1726882159.65890: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae37cfe00> import 'pwd' # <<< 10896 1726882159.65908: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 10896 1726882159.65935: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 10896 1726882159.65963: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae312dbe0> <<< 10896 1726882159.66004: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae312f800> <<< 10896 1726882159.66032: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 10896 1726882159.66044: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 10896 1726882159.66078: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3130200> <<< 10896 1726882159.66105: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 10896 1726882159.66131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae31313a0> <<< 10896 1726882159.66158: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches 
/usr/lib64/python3.12/subprocess.py <<< 10896 1726882159.66187: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 10896 1726882159.66215: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 10896 1726882159.66266: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3133e60> <<< 10896 1726882159.66400: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae37cfd10> <<< 10896 1726882159.66412: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3132030> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 10896 1726882159.66512: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 10896 1726882159.66539: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae313bcb0> <<< 10896 1726882159.66556: stdout chunk (state=3): >>>import '_tokenize' # <<< 10896 1726882159.66623: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae313a780> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae313a4e0> <<< 10896 1726882159.66649: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 10896 1726882159.66722: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae313aa50> <<< 10896 1726882159.66846: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae31325a0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae317ffb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 
0x7faae3180140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 10896 1726882159.66871: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 10896 1726882159.66906: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 10896 1726882159.66933: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3181bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3181970> <<< 10896 1726882159.66958: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 10896 1726882159.67048: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 10896 1726882159.67075: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3184170> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae31822a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 10896 1726882159.67109: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 10896 1726882159.67121: stdout chunk (state=3): >>>import '_string' # <<< 10896 1726882159.67161: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3187950> <<< 10896 1726882159.67296: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3184320> <<< 10896 1726882159.67333: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3188710> <<< 10896 1726882159.67368: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3188a70> <<< 10896 1726882159.67414: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3188b00> <<< 10896 1726882159.67429: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3180320> <<< 10896 1726882159.67459: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 10896 1726882159.67513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 10896 1726882159.67585: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae30142c0> <<< 10896 1726882159.67685: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae30155b0> <<< 10896 1726882159.67822: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae318aa50> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae318be00> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae318a690> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 10896 1726882159.67864: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.67952: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.68043: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 10896 1726882159.68133: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.68252: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.68812: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.69309: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 10896 1726882159.69336: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 10896 1726882159.69362: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 10896 1726882159.69451: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae30196d0> <<< 10896 1726882159.69582: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 10896 1726882159.69602: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae301a510> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3015850> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 10896 1726882159.69623: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 10896 1726882159.69760: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.69919: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 10896 1726882159.69942: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae301a1e0> # zipimport: zlib available <<< 10896 1726882159.70390: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.70833: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.70903: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.71137: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 10896 1726882159.71141: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.71228: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 10896 1726882159.71255: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 10896 1726882159.71267: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.71308: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.71335: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 10896 1726882159.71359: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.71689: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.71809: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 10896 1726882159.71854: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 10896 1726882159.72008: stdout chunk (state=3): >>>import '_ast' # <<< 10896 1726882159.72033: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae301b560> # zipimport: zlib 
available # zipimport: zlib available <<< 10896 1726882159.72099: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 10896 1726882159.72122: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 10896 1726882159.72169: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.72220: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 10896 1726882159.72259: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.72306: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.72356: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.72425: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 10896 1726882159.72570: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3026000> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3023d40> <<< 10896 1726882159.72602: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 10896 1726882159.72619: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.72677: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.72732: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.72779: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.72805: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 10896 1726882159.72890: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 10896 1726882159.72926: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 10896 1726882159.72954: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 10896 1726882159.72965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 10896 1726882159.73017: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae310e990> <<< 10896 1726882159.73105: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae31fe660> <<< 
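For context, the chunks around this point show the AnsiballZ payload importing ansible.module_utils.basic together with its helpers (common.arg_spec, common.parameters, common.validation, distro, selinux, argparse), i.e. the machinery that parses and validates a module's options before facts are gathered. A minimal sketch of a module built on that documented AnsibleModule API is given below; the module's 'name' option and its message are hypothetical illustration only and are not part of this run:

#!/usr/bin/python
from ansible.module_utils.basic import AnsibleModule


def main():
    # argument_spec is checked by the arg_spec/parameters/validation
    # helpers whose imports appear in the trace above.
    module = AnsibleModule(
        argument_spec=dict(
            name=dict(type='str', required=True),  # hypothetical example option
        ),
        supports_check_mode=True,
    )
    # exit_json serializes the result back to the controller as JSON,
    # which is what the controller reads from these stdout chunks.
    module.exit_json(changed=False, msg='hello %s' % module.params['name'])


if __name__ == '__main__':
    main()
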
10896 1726882159.73140: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae30260c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae301be30> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 10896 1726882159.73158: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.73179: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.73214: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 10896 1726882159.73288: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 10896 1726882159.73398: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.73401: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 10896 1726882159.73428: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.73446: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10896 1726882159.73479: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.73513: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.73558: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.73589: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.73630: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 10896 1726882159.73744: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.73776: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10896 1726882159.73803: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.73830: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 10896 1726882159.73851: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.74014: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.74197: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.74225: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.74280: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 10896 1726882159.74333: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 10896 1726882159.74424: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 10896 1726882159.74447: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae30ba2d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 10896 1726882159.74472: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 10896 1726882159.74502: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 10896 1726882159.74525: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2c83f80> <<< 10896 1726882159.74684: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae2c882f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae30aad80> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae30bae40> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae30b8980> <<< 10896 1726882159.74688: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae30b8d70> <<< 10896 1726882159.74862: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 10896 1726882159.74881: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae2c8b2c0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2c8ab70> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae2c8ad50> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2c89fa0> <<< 10896 1726882159.74906: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 10896 1726882159.74991: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 10896 1726882159.75249: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2c8b380> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/connection.py <<< 10896 1726882159.75253: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae2cedeb0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2c8be90> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae30b9400> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 10896 1726882159.75256: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.75312: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 10896 1726882159.75328: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.75379: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.75419: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 10896 1726882159.75451: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 10896 1726882159.75473: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.75560: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.75587: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available <<< 10896 1726882159.75643: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 10896 1726882159.75684: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.75726: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 10896 1726882159.75745: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.75908: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10896 1726882159.75911: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.75975: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 10896 1726882159.75992: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.76457: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.76926: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 10896 1726882159.76929: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.76987: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.77012: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.77062: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 10896 1726882159.77078: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.77180: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 10896 1726882159.77203: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available <<< 10896 1726882159.77250: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 10896 1726882159.77271: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.77321: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.77325: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 10896 1726882159.77369: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.77380: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.77516: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 10896 1726882159.77531: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.77551: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.77580: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 10896 1726882159.77613: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2cee6f0> <<< 10896 1726882159.77637: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 10896 1726882159.77769: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2cee900> import 'ansible.module_utils.facts.system.local' # <<< 10896 1726882159.77772: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.77836: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.77911: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 10896 1726882159.77914: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.77997: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.78091: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 10896 1726882159.78185: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.78344: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 10896 1726882159.78404: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 10896 1726882159.78828: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 10896 1726882159.78831: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae2d2a030> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2d19eb0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # 
zipimport: zlib available <<< 10896 1726882159.78886: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.78971: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.79219: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 10896 1726882159.79241: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.79273: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.79312: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 10896 1726882159.79329: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.79366: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.79440: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 10896 1726882159.79491: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae2d3d940> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2d1af30> import 'ansible.module_utils.facts.system.user' # <<< 10896 1726882159.79562: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.79565: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available <<< 10896 1726882159.79825: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 10896 1726882159.79828: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.79913: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 10896 1726882159.80017: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.80114: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.80157: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.80267: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 10896 1726882159.80400: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.80542: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 10896 1726882159.80562: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.80673: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.80833: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 10896 1726882159.80840: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.80901: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.81459: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.82148: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 10896 1726882159.82151: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.82154: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 10896 1726882159.82374: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 10896 1726882159.82504: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.82703: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 10896 1726882159.82754: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.82814: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 10896 1726882159.82827: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.82888: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.83031: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.83189: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.83390: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 10896 1726882159.83413: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.83448: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.83556: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 10896 1726882159.83559: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.83579: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 10896 1726882159.83623: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.83689: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 10896 1726882159.83719: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.83747: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # <<< 10896 1726882159.83889: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 10896 1726882159.83935: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.84003: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 10896 1726882159.84267: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.84529: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 10896 1726882159.84610: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10896 1726882159.84656: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 10896 1726882159.84701: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.84722: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 10896 1726882159.84768: stdout chunk (state=3): >>># zipimport: 
zlib available # zipimport: zlib available <<< 10896 1726882159.84816: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available<<< 10896 1726882159.84862: stdout chunk (state=3): >>> <<< 10896 1726882159.84892: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 10896 1726882159.84986: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.85051: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 10896 1726882159.85088: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 10896 1726882159.85108: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.85213: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 10896 1726882159.85237: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.85250: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.85291: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.85336: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.85408: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.85517: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 10896 1726882159.85616: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.85637: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 10896 1726882159.85801: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.85991: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 10896 1726882159.86009: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.86049: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.86208: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 10896 1726882159.86211: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 10896 1726882159.86213: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.86284: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.86368: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 10896 1726882159.86392: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.86472: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.86569: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 10896 1726882159.86627: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882159.86817: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 10896 1726882159.86953: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 10896 1726882159.86966: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae2b3a3f0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2b38dd0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2b33dd0> <<< 10896 1726882159.97839: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2b80260> <<< 10896 1726882159.97953: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 10896 1726882159.97956: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2b81130> <<< 10896 1726882159.97986: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 10896 1726882159.98099: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2b83530> <<< 10896 1726882159.98114: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2b825a0> <<< 10896 1726882159.98268: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 10896 1726882160.21905: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", 
"ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", <<< 10896 1726882160.21936: stdout chunk (state=3): >>>"XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "29", "second": "19", "epoch": "1726882159", "epoch_int": "1726882159", "date": "2024-09-20", "time": "21:29:19", "iso8601_micro": "2024-09-21T01:29:19.877028Z", "iso8601": "2024-09-21T01:29:19Z", "iso8601_basic": "20240920T212919877028", "iso8601_basic_short": "20240920T212919", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.45849609375, "5m": 0.23486328125, "15m": 0.1103515625}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2974, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 557, "free": 2974}, "nocache": {"free": 3301, "used": 230}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 350, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, 
"ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261793878016, "block_size": 4096, "block_total": 65519099, "block_available": 63914521, "block_used": 1604578, "inode_total": 131070960, "inode_available": 131029075, "inode_used": 41885, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "<<< 10896 1726882160.21957: stdout chunk (state=3): >>>tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": 
"on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 10896 1726882160.22531: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 10896 1726882160.22562: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore 
sys.stdout # restore sys.stderr <<< 10896 1726882160.22599: stdout chunk (state=3): >>># cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword <<< 10896 1726882160.22630: stdout chunk (state=3): >>># destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma <<< 10896 1726882160.22661: stdout chunk (state=3): >>># cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy 
pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json <<< 10896 1726882160.22709: stdout chunk (state=3): >>># cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common <<< 10896 1726882160.22739: stdout chunk (state=3): >>># destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters <<< 10896 1726882160.22780: stdout chunk (state=3): >>># destroy ansible.module_utils.common.parameters # 
cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic <<< 10896 1726882160.22868: stdout chunk (state=3): >>># cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # 
cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios <<< 10896 1726882160.22995: stdout chunk (state=3): >>># cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy 
ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat <<< 10896 1726882160.23014: stdout chunk (state=3): >>># cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 10896 1726882160.23240: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 10896 1726882160.23528: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy 
binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 10896 1726882160.23531: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil <<< 10896 1726882160.23586: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue <<< 10896 1726882160.23612: stdout chunk (state=3): >>># destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 10896 1726882160.23641: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 10896 1726882160.23670: stdout chunk (state=3): >>># destroy _ssl <<< 10896 1726882160.23699: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 10896 1726882160.23753: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing <<< 10896 1726882160.23780: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection <<< 10896 1726882160.23850: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 10896 1726882160.23990: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 10896 1726882160.24018: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping 
importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 10896 1726882160.24135: stdout chunk (state=3): >>># destroy sys.monitoring <<< 10896 1726882160.24157: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 10896 1726882160.24179: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 10896 1726882160.24323: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib <<< 10896 1726882160.24339: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 10896 1726882160.24388: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 10896 1726882160.24451: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 10896 1726882160.24476: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 10896 1726882160.24502: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 10896 1726882160.24546: stdout chunk (state=3): >>># clear sys.audit hooks <<< 10896 1726882160.24843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.14.69 closed. <<< 10896 1726882160.24866: stdout chunk (state=3): >>><<< 10896 1726882160.25109: stderr chunk (state=3): >>><<< 10896 1726882160.25148: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3c184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3be7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3c1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a2dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
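The module stdout captured above interleaves PYTHONVERBOSE import tracing (PYTHONVERBOSE=1 appears in ansible_env) with a single JSON result object, the {"ansible_facts": ...} payload, so the controller has to recover that one JSON line from the noisy stream before the facts can be registered for managed_node1. The following is a minimal Python sketch of that idea only; extract_module_result and the abbreviated stdout stub are illustrative assumptions and are not Ansible's actual result-parsing code.

    import json

    def extract_module_result(stdout: str) -> dict:
        # Scan the captured module stdout line by line and parse the first
        # line that looks like a JSON object or array; everything else
        # (import tracing, cleanup messages) is ignored.
        for line in stdout.splitlines():
            line = line.strip()
            if line.startswith("{") or line.startswith("["):
                return json.loads(line)
        raise ValueError("no JSON result found in module output")

    # Stub resembling the captured stdout above, heavily abbreviated;
    # the values are taken from the facts payload earlier in this log.
    stdout = (
        "import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader ...>\n"
        '{"ansible_facts": {"ansible_default_ipv4": {"address": "10.31.14.69"}, '
        '"ansible_service_mgr": "systemd"}, '
        '"invocation": {"module_args": {"gather_subset": ["all"]}}}\n'
        "# clear sys.path_importer_cache\n"
    )

    result = extract_module_result(stdout)
    print(result["ansible_facts"]["ansible_default_ipv4"]["address"])

Running this sketch prints 10.31.14.69, matching the ansible_default_ipv4.address reported in the gathered facts above; the surrounding import and cleanup lines are discarded just as they are absent from the parsed result that follows in the log.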
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a6bdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a6bfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3aa37a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3aa3e30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a83aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a811c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a68f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3ac3710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3ac2330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a82090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3ac0b90> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3af8740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a68200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3af8bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3af8aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3af8e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3a66d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3af9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3af9250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3afa480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3b10680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3b11d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7faae3b12c00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3b13260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3b12150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3b13ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3b13410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3afa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3857bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae38806e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3880440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3880710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3881040> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3881a30> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7faae38808f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3855d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3882de0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3881b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3afaba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae38af140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae38cf500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae39302c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3932a20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae39303e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae38f92e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae37393d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae38ce300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3883d10> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7faae38ce900> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_xc47bg__/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae379f0e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae377dfd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae377d160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae379cfb0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae37ce960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae37ce6f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae37ce030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae37ce750> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae379fd70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae37cf680> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 
'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae37cf8c0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae37cfe00> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae312dbe0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae312f800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3130200> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae31313a0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3133e60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae37cfd10> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3132030> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae313bcb0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae313a780> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7faae313a4e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae313aa50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae31325a0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae317ffb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3180140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3181bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3181970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3184170> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae31822a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3187950> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3184320> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3188710> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3188a70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3188b00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3180320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae30142c0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae30155b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae318aa50> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae318be00> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae318a690> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7faae30196d0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae301a510> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3015850> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae301a1e0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae301b560> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae3026000> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae3023d40> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae310e990> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae31fe660> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae30260c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae301be30> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae30ba2d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2c83f80> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7faae2c882f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae30aad80> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae30bae40> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae30b8980> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae30b8d70> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae2c8b2c0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2c8ab70> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae2c8ad50> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2c89fa0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2c8b380> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae2cedeb0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2c8be90> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae30b9400> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2cee6f0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2cee900> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae2d2a030> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2d19eb0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae2d3d940> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2d1af30> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faae2b3a3f0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2b38dd0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2b33dd0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2b80260> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2b81130> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2b83530> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7faae2b825a0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "29", "second": "19", "epoch": "1726882159", "epoch_int": "1726882159", "date": "2024-09-20", "time": "21:29:19", "iso8601_micro": "2024-09-21T01:29:19.877028Z", "iso8601": "2024-09-21T01:29:19Z", 
"iso8601_basic": "20240920T212919877028", "iso8601_basic_short": "20240920T212919", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.45849609375, "5m": 0.23486328125, "15m": 0.1103515625}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2974, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 557, "free": 2974}, "nocache": {"free": 3301, "used": 230}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", 
"ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 350, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261793878016, "block_size": 4096, "block_total": 65519099, "block_available": 63914521, "block_used": 1604578, "inode_total": 131070960, "inode_available": 131029075, "inode_used": 41885, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", 
"hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, 
"ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # 
cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy 
ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing 
ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing 
ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing 
multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping 
re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. [WARNING]: Module invocation had junk after the JSON data (a verbatim repeat of the Python interpreter shutdown trace above). [WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
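
The interpreter-discovery warning above can be avoided by pinning the Python interpreter for the managed host instead of relying on discovery. A minimal inventory sketch follows; the host name, address, and interpreter path are taken from this log, but the exact inventory layout used by the test run is an assumption:

    # Hypothetical inventory snippet: pins the interpreter so that installing
    # another Python on the managed host cannot change which one Ansible uses.
    all:
      hosts:
        managed_node2:
          ansible_host: 10.31.14.69
          ansible_python_interpreter: /usr/bin/python3.12

The same effect can be achieved globally by setting interpreter_python in the [defaults] section of ansible.cfg; either way, the discovery warning is no longer emitted.
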
10896 1726882160.27555: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882158.546461-10945-226747738681236/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882160.27559: _low_level_execute_command(): starting 10896 1726882160.27561: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882158.546461-10945-226747738681236/ > /dev/null 2>&1 && sleep 0' 10896 1726882160.28512: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882160.28531: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882160.28549: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 10896 1726882160.28562: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10896 1726882160.28571: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882160.28659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882160.29108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882160.29287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 3 <<< 10896 1726882160.31831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882160.31871: stderr chunk (state=3): >>><<< 10896 1726882160.31879: stdout chunk (state=3): >>><<< 10896 1726882160.31915: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 3 debug2: Received exit status from master 0 10896 1726882160.31927: handler run complete 10896 1726882160.32059: variable 'ansible_facts' from source: unknown 10896 1726882160.32152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882160.32538: variable 'ansible_facts' from source: unknown 10896 1726882160.32662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882160.32916: attempt loop complete, returning result 10896 1726882160.32919: _execute() done 10896 1726882160.32921: dumping result to json 10896 1726882160.32923: done dumping result, returning 10896 1726882160.32925: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [12673a56-9f93-8b02-b216-0000000000cd] 10896 1726882160.32927: sending task result for task 12673a56-9f93-8b02-b216-0000000000cd ok: [managed_node2] 10896 1726882160.34057: no more pending results, returning what we have 10896 1726882160.34060: results queue empty 10896 1726882160.34061: checking for any_errors_fatal 10896 1726882160.34062: done checking for any_errors_fatal 10896 1726882160.34063: checking for max_fail_percentage 10896 1726882160.34064: done checking for max_fail_percentage 10896 1726882160.34065: checking to see if all hosts have failed and the running result is not ok 10896 1726882160.34066: done checking to see if all hosts have failed 10896 1726882160.34066: getting the remaining hosts for this loop 10896 1726882160.34068: done getting the remaining hosts for this loop 10896 1726882160.34071: getting the next task for host managed_node2 10896 1726882160.34076: done getting next task for host managed_node2 10896 1726882160.34077: ^ task is: TASK: meta (flush_handlers) 10896 1726882160.34079: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882160.34083: getting variables 10896 1726882160.34084: in VariableManager get_vars() 10896 1726882160.34107: Calling all_inventory to load vars for managed_node2 10896 1726882160.34110: Calling groups_inventory to load vars for managed_node2 10896 1726882160.34113: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882160.34119: done sending task result for task 12673a56-9f93-8b02-b216-0000000000cd 10896 1726882160.34121: WORKER PROCESS EXITING 10896 1726882160.34130: Calling all_plugins_play to load vars for managed_node2 10896 1726882160.34132: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882160.34135: Calling groups_plugins_play to load vars for managed_node2 10896 1726882160.34300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882160.34617: done with get_vars() 10896 1726882160.34627: done getting variables 10896 1726882160.34916: in VariableManager get_vars() 10896 1726882160.34926: Calling all_inventory to load vars for managed_node2 10896 1726882160.34929: Calling groups_inventory to load vars for managed_node2 10896 1726882160.34931: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882160.34937: Calling all_plugins_play to load vars for managed_node2 10896 1726882160.34939: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882160.34942: Calling groups_plugins_play to load vars for managed_node2 10896 1726882160.35075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882160.35452: done with get_vars() 10896 1726882160.35465: done queuing things up, now waiting for results queue to drain 10896 1726882160.35467: results queue empty 10896 1726882160.35468: checking for any_errors_fatal 10896 1726882160.35471: done checking for any_errors_fatal 10896 1726882160.35471: checking for max_fail_percentage 10896 1726882160.35472: done checking for max_fail_percentage 10896 1726882160.35477: checking to see if all hosts have failed and the running result is not ok 10896 1726882160.35478: done checking to see if all hosts have failed 10896 1726882160.35479: getting the remaining hosts for this loop 10896 1726882160.35480: done getting the remaining hosts for this loop 10896 1726882160.35482: getting the next task for host managed_node2 10896 1726882160.35487: done getting next task for host managed_node2 10896 1726882160.35489: ^ task is: TASK: Include the task 'el_repo_setup.yml' 10896 1726882160.35491: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882160.35692: getting variables 10896 1726882160.35695: in VariableManager get_vars() 10896 1726882160.35704: Calling all_inventory to load vars for managed_node2 10896 1726882160.35706: Calling groups_inventory to load vars for managed_node2 10896 1726882160.35709: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882160.35713: Calling all_plugins_play to load vars for managed_node2 10896 1726882160.35716: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882160.35718: Calling groups_plugins_play to load vars for managed_node2 10896 1726882160.35847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882160.36066: done with get_vars() 10896 1726882160.36073: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:11 Friday 20 September 2024 21:29:20 -0400 (0:00:01.902) 0:00:01.928 ****** 10896 1726882160.36161: entering _queue_task() for managed_node2/include_tasks 10896 1726882160.36163: Creating lock for include_tasks 10896 1726882160.36462: worker is 1 (out of 1 available) 10896 1726882160.36474: exiting _queue_task() for managed_node2/include_tasks 10896 1726882160.36486: done queuing things up, now waiting for results queue to drain 10896 1726882160.36488: waiting for pending results... 10896 1726882160.36716: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 10896 1726882160.36830: in run() - task 12673a56-9f93-8b02-b216-000000000006 10896 1726882160.36851: variable 'ansible_search_path' from source: unknown 10896 1726882160.36891: calling self._execute() 10896 1726882160.36987: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882160.37002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882160.37018: variable 'omit' from source: magic vars 10896 1726882160.37150: _execute() done 10896 1726882160.37158: dumping result to json 10896 1726882160.37166: done dumping result, returning 10896 1726882160.37178: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [12673a56-9f93-8b02-b216-000000000006] 10896 1726882160.37189: sending task result for task 12673a56-9f93-8b02-b216-000000000006 10896 1726882160.37399: done sending task result for task 12673a56-9f93-8b02-b216-000000000006 10896 1726882160.37402: WORKER PROCESS EXITING 10896 1726882160.37448: no more pending results, returning what we have 10896 1726882160.37453: in VariableManager get_vars() 10896 1726882160.37485: Calling all_inventory to load vars for managed_node2 10896 1726882160.37488: Calling groups_inventory to load vars for managed_node2 10896 1726882160.37492: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882160.37510: Calling all_plugins_play to load vars for managed_node2 10896 1726882160.37587: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882160.37592: Calling groups_plugins_play to load vars for managed_node2 10896 1726882160.37896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882160.38067: done with get_vars() 10896 1726882160.38075: variable 'ansible_search_path' from source: unknown 10896 1726882160.38087: we have included files to process 10896 1726882160.38089: 
generating all_blocks data 10896 1726882160.38090: done generating all_blocks data 10896 1726882160.38091: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 10896 1726882160.38092: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 10896 1726882160.38096: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 10896 1726882160.38730: in VariableManager get_vars() 10896 1726882160.38745: done with get_vars() 10896 1726882160.38756: done processing included file 10896 1726882160.38758: iterating over new_blocks loaded from include file 10896 1726882160.38759: in VariableManager get_vars() 10896 1726882160.38767: done with get_vars() 10896 1726882160.38769: filtering new block on tags 10896 1726882160.38781: done filtering new block on tags 10896 1726882160.38784: in VariableManager get_vars() 10896 1726882160.38796: done with get_vars() 10896 1726882160.38797: filtering new block on tags 10896 1726882160.38811: done filtering new block on tags 10896 1726882160.38813: in VariableManager get_vars() 10896 1726882160.38823: done with get_vars() 10896 1726882160.38825: filtering new block on tags 10896 1726882160.38836: done filtering new block on tags 10896 1726882160.38837: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 10896 1726882160.38843: extending task lists for all hosts with included blocks 10896 1726882160.38885: done extending task lists 10896 1726882160.38886: done processing included files 10896 1726882160.38887: results queue empty 10896 1726882160.38887: checking for any_errors_fatal 10896 1726882160.38889: done checking for any_errors_fatal 10896 1726882160.38889: checking for max_fail_percentage 10896 1726882160.38890: done checking for max_fail_percentage 10896 1726882160.38891: checking to see if all hosts have failed and the running result is not ok 10896 1726882160.38892: done checking to see if all hosts have failed 10896 1726882160.38892: getting the remaining hosts for this loop 10896 1726882160.38895: done getting the remaining hosts for this loop 10896 1726882160.38897: getting the next task for host managed_node2 10896 1726882160.38901: done getting next task for host managed_node2 10896 1726882160.38903: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 10896 1726882160.38905: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882160.38907: getting variables 10896 1726882160.38907: in VariableManager get_vars() 10896 1726882160.38915: Calling all_inventory to load vars for managed_node2 10896 1726882160.38916: Calling groups_inventory to load vars for managed_node2 10896 1726882160.38918: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882160.38924: Calling all_plugins_play to load vars for managed_node2 10896 1726882160.38926: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882160.38928: Calling groups_plugins_play to load vars for managed_node2 10896 1726882160.39100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882160.39289: done with get_vars() 10896 1726882160.39299: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:29:20 -0400 (0:00:00.032) 0:00:01.961 ****** 10896 1726882160.39460: entering _queue_task() for managed_node2/setup 10896 1726882160.40096: worker is 1 (out of 1 available) 10896 1726882160.40108: exiting _queue_task() for managed_node2/setup 10896 1726882160.40119: done queuing things up, now waiting for results queue to drain 10896 1726882160.40120: waiting for pending results... 10896 1726882160.40542: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 10896 1726882160.40713: in run() - task 12673a56-9f93-8b02-b216-0000000000de 10896 1726882160.40900: variable 'ansible_search_path' from source: unknown 10896 1726882160.40905: variable 'ansible_search_path' from source: unknown 10896 1726882160.40908: calling self._execute() 10896 1726882160.40910: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882160.40914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882160.40916: variable 'omit' from source: magic vars 10896 1726882160.41428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882160.44485: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882160.44571: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882160.44623: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882160.44675: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882160.44708: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882160.44799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882160.44843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882160.44875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 10896 1726882160.44946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882160.44949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882160.45117: variable 'ansible_facts' from source: unknown 10896 1726882160.45197: variable 'network_test_required_facts' from source: task vars 10896 1726882160.45243: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 10896 1726882160.45272: variable 'omit' from source: magic vars 10896 1726882160.45302: variable 'omit' from source: magic vars 10896 1726882160.45381: variable 'omit' from source: magic vars 10896 1726882160.45384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882160.45402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882160.45423: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882160.45453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882160.45469: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882160.45509: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882160.45518: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882160.45599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882160.45629: Set connection var ansible_connection to ssh 10896 1726882160.45640: Set connection var ansible_timeout to 10 10896 1726882160.45647: Set connection var ansible_shell_type to sh 10896 1726882160.45660: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882160.45670: Set connection var ansible_shell_executable to /bin/sh 10896 1726882160.45679: Set connection var ansible_pipelining to False 10896 1726882160.45713: variable 'ansible_shell_executable' from source: unknown 10896 1726882160.45721: variable 'ansible_connection' from source: unknown 10896 1726882160.45729: variable 'ansible_module_compression' from source: unknown 10896 1726882160.45735: variable 'ansible_shell_type' from source: unknown 10896 1726882160.45741: variable 'ansible_shell_executable' from source: unknown 10896 1726882160.45747: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882160.45754: variable 'ansible_pipelining' from source: unknown 10896 1726882160.45760: variable 'ansible_timeout' from source: unknown 10896 1726882160.45767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882160.45910: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10896 1726882160.45931: variable 'omit' from source: magic vars 10896 1726882160.45998: starting attempt loop 10896 
1726882160.46001: running the handler 10896 1726882160.46003: _low_level_execute_command(): starting 10896 1726882160.46005: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882160.46701: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882160.46723: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882160.46809: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882160.46828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882160.46845: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882160.46861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882160.46955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 10896 1726882160.49239: stdout chunk (state=3): >>>/root <<< 10896 1726882160.49368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882160.49380: stdout chunk (state=3): >>><<< 10896 1726882160.49390: stderr chunk (state=3): >>><<< 10896 1726882160.49414: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 10896 1726882160.49578: _low_level_execute_command(): starting 10896 1726882160.49581: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882160.494495-11022-142652196308137 `" && echo 
ansible-tmp-1726882160.494495-11022-142652196308137="` echo /root/.ansible/tmp/ansible-tmp-1726882160.494495-11022-142652196308137 `" ) && sleep 0' 10896 1726882160.50890: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882160.50895: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882160.50898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882160.50901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882160.50903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882160.50905: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882160.50919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882160.50985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882160.51135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882160.51222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 10896 1726882160.53833: stdout chunk (state=3): >>>ansible-tmp-1726882160.494495-11022-142652196308137=/root/.ansible/tmp/ansible-tmp-1726882160.494495-11022-142652196308137 <<< 10896 1726882160.54005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882160.54044: stderr chunk (state=3): >>><<< 10896 1726882160.54058: stdout chunk (state=3): >>><<< 10896 1726882160.54082: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882160.494495-11022-142652196308137=/root/.ansible/tmp/ansible-tmp-1726882160.494495-11022-142652196308137 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received 
exit status from master 0 10896 1726882160.54401: variable 'ansible_module_compression' from source: unknown 10896 1726882160.54404: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 10896 1726882160.54406: variable 'ansible_facts' from source: unknown 10896 1726882160.54662: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882160.494495-11022-142652196308137/AnsiballZ_setup.py 10896 1726882160.55115: Sending initial data 10896 1726882160.55238: Sent initial data (153 bytes) 10896 1726882160.56396: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882160.56645: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882160.56712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882160.56731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882160.56834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 10896 1726882160.59235: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882160.59300: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10896 1726882160.59372: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpbq_q1hd4 /root/.ansible/tmp/ansible-tmp-1726882160.494495-11022-142652196308137/AnsiballZ_setup.py <<< 10896 1726882160.59473: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882160.494495-11022-142652196308137/AnsiballZ_setup.py" <<< 10896 1726882160.59500: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpbq_q1hd4" to remote "/root/.ansible/tmp/ansible-tmp-1726882160.494495-11022-142652196308137/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882160.494495-11022-142652196308137/AnsiballZ_setup.py" <<< 10896 1726882160.62405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882160.62482: stderr chunk (state=3): >>><<< 10896 1726882160.62491: stdout chunk (state=3): >>><<< 10896 1726882160.62521: done transferring module to remote 10896 1726882160.62586: _low_level_execute_command(): starting 10896 1726882160.62600: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882160.494495-11022-142652196308137/ /root/.ansible/tmp/ansible-tmp-1726882160.494495-11022-142652196308137/AnsiballZ_setup.py && sleep 0' 10896 1726882160.63838: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882160.63852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882160.63908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882160.64059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882160.64421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882160.64814: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882160.66681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882160.66685: stdout chunk (state=3): >>><<< 10896 1726882160.66688: stderr chunk (state=3): >>><<< 10896 1726882160.66690: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882160.66697: _low_level_execute_command(): starting 10896 1726882160.66699: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882160.494495-11022-142652196308137/AnsiballZ_setup.py && sleep 0' 10896 1726882160.67909: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882160.67923: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882160.68030: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882160.68202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882160.68218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882160.68971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882160.71746: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 10896 1726882160.71749: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 10896 1726882160.71751: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4526184d0> <<< 10896 1726882160.71753: stdout chunk 
(state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4525e7b30> <<< 10896 1726882160.71912: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45261aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # <<< 10896 1726882160.71925: stdout chunk (state=3): >>>import 'stat' # <<< 10896 1726882160.72036: stdout chunk (state=3): >>>import '_collections_abc' # <<< 10896 1726882160.72068: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 10896 1726882160.72113: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 10896 1726882160.72146: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 10896 1726882160.72412: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45242d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45242dfa0> import 'site' # <<< 10896 1726882160.72488: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 10896 1726882160.73087: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45246be90> <<< 10896 1726882160.73109: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 10896 1726882160.73127: stdout chunk (state=3): >>>import '_operator' # <<< 10896 1726882160.73141: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45246bf50> <<< 10896 1726882160.73153: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 10896 1726882160.73189: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 10896 1726882160.73205: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 10896 1726882160.73303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524a3830> <<< 10896 1726882160.73321: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 10896 1726882160.73339: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524a3ec0> <<< 10896 1726882160.73353: stdout chunk (state=3): >>>import '_collections' # <<< 10896 1726882160.73583: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452483b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452481280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452469040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 10896 1726882160.73686: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 10896 1726882160.73692: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # 
/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524c23f0> <<< 10896 1726882160.73728: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452482150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524c0c20> <<< 10896 1726882160.73800: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524f8860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524682c0> <<< 10896 1726882160.73814: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 10896 1726882160.73843: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 10896 1726882160.73855: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4524f8d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524f8bc0> <<< 10896 1726882160.73905: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4524f8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452466de0> <<< 10896 1726882160.73923: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 10896 1726882160.73947: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 10896 1726882160.74081: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524f9610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524f92e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524fa510> <<< 10896 
1726882160.74087: stdout chunk (state=3): >>>import 'importlib.util' # <<< 10896 1726882160.74089: stdout chunk (state=3): >>>import 'runpy' # <<< 10896 1726882160.74202: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452510710> import 'errno' # <<< 10896 1726882160.74238: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa452511df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 10896 1726882160.74278: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452512c90> <<< 10896 1726882160.74400: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4525132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4525121e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa452513d70> <<< 10896 1726882160.74517: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4525134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524fa540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 10896 1726882160.74520: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 10896 1726882160.74536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 10896 1726882160.74591: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 10896 1726882160.74606: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa45222fbf0> <<< 10896 1726882160.74609: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 10896 1726882160.74612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 10896 1726882160.74647: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4522586e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452258440> <<< 10896 1726882160.74650: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa452258710> <<< 10896 1726882160.74721: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 10896 1726882160.74757: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 10896 1726882160.74903: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa452259040> <<< 10896 1726882160.74980: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 10896 1726882160.75011: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4522599a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4522588f0> <<< 10896 1726882160.75015: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45222dd90> <<< 10896 1726882160.75157: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45225adb0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452259af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524fac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 10896 1726882160.75368: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452287110> <<< 10896 1726882160.75384: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 10896 1726882160.75438: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4522a74a0> <<< 10896 1726882160.75507: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 10896 1726882160.75621: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452308260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 10896 1726882160.75636: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 10896 1726882160.75760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45230a9c0> <<< 10896 1726882160.75925: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452308380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4522d1280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452109340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4522a62a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45225bce0> <<< 10896 1726882160.76217: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa4521095b0> <<< 10896 1726882160.76446: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_r9jv_7a6/ansible_setup_payload.zip' # zipimport: zlib available <<< 10896 1726882160.76473: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.76509: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 10896 
1726882160.76559: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 10896 1726882160.76804: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4521730b0> import '_typing' # <<< 10896 1726882160.76850: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452151fa0> <<< 10896 1726882160.76857: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452151160> # zipimport: zlib available <<< 10896 1726882160.76887: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 10896 1726882160.77011: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 10896 1726882160.78410: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.79414: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 10896 1726882160.79418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452170f80> <<< 10896 1726882160.79421: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 10896 1726882160.79521: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4521a29c0> <<< 10896 1726882160.79615: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4521a2780> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4521a2090> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 10896 1726882160.79638: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4521a2ab0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452173d40> <<< 10896 1726882160.79645: stdout chunk (state=3): >>>import 'atexit' # <<< 10896 1726882160.79709: stdout chunk (state=3): >>># extension module 'grp' loaded from 
'/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4521a3740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4521a3980> <<< 10896 1726882160.79721: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 10896 1726882160.79915: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4521a3ec0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b29b20> <<< 10896 1726882160.79950: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451b2b7a0> <<< 10896 1726882160.79990: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 10896 1726882160.79998: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 10896 1726882160.80141: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b2c140> <<< 10896 1726882160.80717: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b2d2b0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b2fd70> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4523081d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b2e030> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b37ad0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b365a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b36300> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b36870> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b2e540> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451b7bda0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 10896 1726882160.80728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b7bf20> <<< 10896 1726882160.80730: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 10896 1726882160.80748: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 10896 1726882160.80763: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 10896 1726882160.80815: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 10896 1726882160.80819: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451b7da00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b7d7c0> <<< 10896 1726882160.80822: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 10896 1726882160.81160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451b7ff80> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b7e0f0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b836b0> <<< 10896 1726882160.81163: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b7fef0> <<< 10896 1726882160.81215: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451b849e0> <<< 10896 1726882160.81240: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451b84530> <<< 10896 1726882160.81481: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451b84b00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b7c0e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451a101a0> <<< 10896 1726882160.81609: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451a112b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b86930> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 10896 1726882160.81616: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451b87ce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b86570> # zipimport: zlib available <<< 10896 1726882160.81629: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 10896 1726882160.81651: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.81909: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available <<< 10896 1726882160.81931: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 10896 1726882160.81998: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.82115: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.82627: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.83169: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 10896 1726882160.83271: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451a154f0> <<< 10896 1726882160.83339: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 10896 1726882160.83368: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451a162a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451a11430> <<< 10896 1726882160.83407: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 10896 1726882160.83435: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10896 1726882160.83479: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 10896 1726882160.83609: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.83761: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 10896 1726882160.83782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451a16390> # zipimport: zlib available <<< 10896 1726882160.84237: stdout chunk (state=3): >>># zipimport: zlib available <<< 
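(An illustrative aside, not part of the captured stream.) The per-module lines above, such as import 'uuid' # <_frozen_importlib_external.SourceFileLoader ...> and # zipimport: zlib available, are CPython's verbose import tracing; the target interpreter emits them because PYTHONVERBOSE=1 is set in its environment (it appears in ansible_env in the facts payload further below). That makes the trace a convenient record of exactly which modules the AnsiballZ payload imported. A minimal sketch of a filter over a saved copy of this log (the script name and usage are assumptions, not part of the run):

    # list_imports.py - print each module reported by CPython's verbose import
    # tracer, in first-seen order, given a saved copy of this log on stdin.
    import re
    import sys

    IMPORT_RE = re.compile(r"import '([^']+)' #")   # matches: import 'uuid' # <loader ...>

    def imported_modules(log_text: str) -> list[str]:
        seen: dict[str, None] = {}
        for name in IMPORT_RE.findall(log_text):
            seen.setdefault(name, None)             # de-duplicate, keep first-seen order
        return list(seen)

    if __name__ == "__main__":
        for name in imported_modules(sys.stdin.read()):
            print(name)

Usage would be along the lines of: python3 list_imports.py < saved_run.log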
10896 1726882160.84675: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.84734: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.84814: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 10896 1726882160.84909: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 10896 1726882160.84965: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.85044: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 10896 1726882160.85126: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 10896 1726882160.85160: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 10896 1726882160.85183: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.85389: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.85658: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 10896 1726882160.85713: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 10896 1726882160.85767: stdout chunk (state=3): >>>import '_ast' # <<< 10896 1726882160.85813: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451a175f0> # zipimport: zlib available <<< 10896 1726882160.85991: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.86003: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 10896 1726882160.86006: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 10896 1726882160.86029: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 10896 1726882160.86066: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.86116: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.86203: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.86235: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 10896 1726882160.86346: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 10896 1726882160.86350: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451a21f70> <<< 10896 1726882160.86378: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451a1cef0> <<< 10896 1726882160.86436: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # 
zipimport: zlib available <<< 10896 1726882160.86511: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.86541: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.86641: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 10896 1726882160.86644: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 10896 1726882160.86764: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 10896 1726882160.86770: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 10896 1726882160.86810: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b0a8d0> <<< 10896 1726882160.86859: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4521ce5a0> <<< 10896 1726882160.87018: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451a22030> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451a17350> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 10896 1726882160.87046: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.87069: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # <<< 10896 1726882160.87108: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 10896 1726882160.87156: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.87286: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 10896 1726882160.87309: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.87370: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.87383: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.87458: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 10896 1726882160.87489: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.87618: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10896 1726882160.87622: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 10896 1726882160.87806: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.87968: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.88019: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.88069: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 10896 1726882160.88088: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 10896 1726882160.88122: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 10896 1726882160.88144: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 10896 1726882160.88224: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451ab2000> <<< 10896 1726882160.88244: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 10896 1726882160.88303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 10896 1726882160.88440: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451653f20> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 10896 1726882160.88472: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa45166c290> <<< 10896 1726882160.88485: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451a9a9f0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451ab2b40> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451ab0770> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451ab02f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 10896 1726882160.88529: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 10896 1726882160.88548: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 10896 1726882160.88597: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 10896 1726882160.88684: stdout chunk (state=3): >>># extension 
module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa45166f290> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45166eb40> <<< 10896 1726882160.88688: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa45166ed20> <<< 10896 1726882160.88738: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45166df70> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 10896 1726882160.88797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45166f470> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 10896 1726882160.88972: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4516b9fa0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45166ff80> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451ab0410> import 'ansible.module_utils.facts.timeout' # <<< 10896 1726882160.88979: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 10896 1726882160.89011: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.89043: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.89118: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 10896 1726882160.89636: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available <<< 10896 1726882160.89639: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.89706: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.89757: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 10896 1726882160.89772: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.90234: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.90788: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 10896 1726882160.90814: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.90881: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 10896 1726882160.90884: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.90990: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available <<< 10896 1726882160.91045: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 10896 1726882160.91059: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.91087: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.91152: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 10896 1726882160.91189: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # <<< 10896 1726882160.91258: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.91280: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.91363: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 10896 1726882160.91404: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 10896 1726882160.91487: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4516ba1e0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 10896 1726882160.91507: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 10896 1726882160.91566: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4516badb0> import 'ansible.module_utils.facts.system.local' # <<< 10896 1726882160.91569: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.91661: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.91707: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 10896 1726882160.91805: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.91925: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 10896 1726882160.91975: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.92079: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 10896 1726882160.92092: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.92132: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 10896 
1726882160.92173: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 10896 1726882160.92243: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 10896 1726882160.92304: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4516fa390> <<< 10896 1726882160.92511: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4516ea090> <<< 10896 1726882160.92519: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 10896 1726882160.92571: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.92652: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 10896 1726882160.92702: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.92790: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.92900: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.93084: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 10896 1726882160.93103: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10896 1726882160.93199: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 10896 1726882160.93210: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.93239: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 10896 1726882160.93334: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 10896 1726882160.93343: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa45170de80> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45170da90> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 10896 1726882160.93345: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.93424: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # <<< 10896 1726882160.93439: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.93616: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.93925: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 10896 1726882160.93939: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10896 1726882160.93981: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.94015: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 10896 1726882160.94053: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 10896 1726882160.94150: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.94173: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.94216: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.94377: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 10896 1726882160.94498: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.94618: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 10896 1726882160.94639: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.94701: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10896 1726882160.95249: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.95757: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 10896 1726882160.95768: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.95881: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.95997: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 10896 1726882160.96004: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.96227: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 10896 1726882160.96328: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.96485: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 10896 1726882160.96514: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 10896 1726882160.96533: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.96573: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.96606: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 10896 1726882160.96631: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.96714: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.96813: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.97010: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.97217: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 10896 1726882160.97243: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.97280: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.97327: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 10896 1726882160.97368: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 10896 1726882160.97584: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.97587: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.freebsd' # <<< 10896 1726882160.97590: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.97628: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.97700: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 10896 1726882160.97770: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.97887: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 10896 1726882160.98546: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # <<< 10896 1726882160.98549: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.98573: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.98615: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 10896 1726882160.98638: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.98659: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.98757: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 10896 1726882160.98774: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.98857: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 10896 1726882160.98883: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 10896 1726882160.98904: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.98949: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.98987: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 10896 1726882160.99011: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.99037: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.99078: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.99197: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10896 1726882160.99298: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 10896 1726882160.99302: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.99346: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.99415: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 10896 1726882160.99419: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.99597: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.99787: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 10896 1726882160.99803: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.99842: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.99885: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib 
available <<< 10896 1726882160.99939: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882160.99985: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 10896 1726882161.00005: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.00074: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.00146: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 10896 1726882161.00291: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available <<< 10896 1726882161.00344: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 10896 1726882161.00413: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.01446: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa45150fa40> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45150c5c0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45150cec0> <<< 10896 1726882161.02013: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "29", "second": "21", "epoch": "1726882161", "epoch_int": "1726882161", "date": "2024-09-20", "time": "21:29:21", "iso8601_micro": "2024-09-21T01:29:21.011105Z", "iso8601": "2024-09-21T01:29:21Z", "iso8601_basic": "20240920T212921011105", "iso8601_basic_short": "20240920T212921", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": 
"", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 10896 1726882161.02328: stdout 
chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ <<< 10896 1726882161.02349: stdout chunk (state=3): >>># clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix <<< 10896 1726882161.02368: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc <<< 10896 1726882161.02383: stdout chunk (state=3): >>># cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site <<< 10896 1726882161.02413: stdout chunk (state=3): >>># cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii <<< 10896 1726882161.02539: stdout chunk (state=3): >>># cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] 
removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing 
ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext <<< 10896 1726882161.02543: stdout chunk (state=3): >>># destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils <<< 10896 1726882161.02723: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # 
cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy 
ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 10896 1726882161.03105: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy 
_lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder <<< 10896 1726882161.03199: stdout chunk (state=3): >>># destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 10896 1726882161.03409: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 10896 1726882161.03630: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 10896 1726882161.03792: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # 
cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 10896 1726882161.03826: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 10896 1726882161.03885: stdout chunk (state=3): >>># destroy _collections <<< 10896 1726882161.03892: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath <<< 10896 1726882161.03902: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize <<< 10896 1726882161.03907: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib <<< 10896 1726882161.03910: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 10896 1726882161.04053: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit <<< 10896 1726882161.04078: stdout chunk (state=3): >>># destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 10896 1726882161.04216: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 10896 1726882161.04501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 10896 1726882161.04504: stderr chunk (state=3): >>><<< 10896 1726882161.04507: stdout chunk (state=3): >>><<< 10896 1726882161.05026: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4526184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4525e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45261aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45242d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45242dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45246be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45246bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524a3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452483b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452481280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452469040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524c23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452482150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524c0c20> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524f8860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524682c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4524f8d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524f8bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4524f8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452466de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524f9610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524f92e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524fa510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452510710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa452511df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fa452512c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4525132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4525121e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa452513d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4525134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524fa540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa45222fbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4522586e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452258440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa452258710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa452259040> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4522599a0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa4522588f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45222dd90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45225adb0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452259af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4524fac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452287110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4522a74a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452308260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45230a9c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452308380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4522d1280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452109340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4522a62a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45225bce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7fa4521095b0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_r9jv_7a6/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4521730b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452151fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452151160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452170f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4521a29c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4521a2780> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4521a2090> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4521a2ab0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa452173d40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4521a3740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4521a3980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4521a3ec0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b29b20> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451b2b7a0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b2c140> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b2d2b0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b2fd70> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4523081d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b2e030> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b37ad0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b365a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b36300> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b36870> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b2e540> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451b7bda0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b7bf20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451b7da00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b7d7c0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451b7ff80> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b7e0f0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b836b0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b7fef0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451b849e0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451b84530> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451b84b00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b7c0e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451a101a0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451a112b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b86930> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451b87ce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b86570> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451a154f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451a162a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451a11430> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451a16390> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451a175f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa451a21f70> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451a1cef0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451b0a8d0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4521ce5a0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451a22030> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451a17350> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451ab2000> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451653f20> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa45166c290> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451a9a9f0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451ab2b40> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451ab0770> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451ab02f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa45166f290> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45166eb40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa45166ed20> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45166df70> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45166f470> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4516b9fa0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45166ff80> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa451ab0410> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4516ba1e0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4516badb0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4516fa390> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4516ea090> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa45170de80> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45170da90> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa45150fa40> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45150c5c0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa45150cec0> {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "29", "second": "21", "epoch": "1726882161", "epoch_int": "1726882161", "date": "2024-09-20", "time": "21:29:21", 
"iso8601_micro": "2024-09-21T01:29:21.011105Z", "iso8601": "2024-09-21T01:29:21Z", "iso8601_basic": "20240920T212921011105", "iso8601_basic_short": "20240920T212921", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": 
"us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] 
removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy 
ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # 
destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] 
wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
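The module result above comes from the setup module run with gather_subset restricted to 'min' (see module_args in the invocation block), and the trailing '# clear …' / '# destroy …' lines match the CPython verbose-mode (-v) interpreter shutdown trace enabled by PYTHONVERBOSE=1, which is visible in ansible_env above; that trace is what the warning below refers to. For readers reproducing this fact gathering outside the test harness, a minimal illustrative sketch of an equivalent task follows; the task names are assumptions, not taken from the recorded playbook.

- name: Gather the minimum subset of facts (illustrative sketch)
  ansible.builtin.setup:
    gather_subset:
      - min
    gather_timeout: 10

- name: Use one of the gathered facts (illustrative sketch)
  ansible.builtin.debug:
    msg: "Package manager is {{ ansible_facts['pkg_mgr'] }}, service manager is {{ ansible_facts['service_mgr'] }}"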
[WARNING]: Module invocation had junk after the JSON data: (the same Python interpreter shutdown trace captured in the module output above)
10896 1726882161.07230: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882160.494495-11022-142652196308137/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882161.07233: _low_level_execute_command(): starting 10896 1726882161.07235: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882160.494495-11022-142652196308137/ > /dev/null 2>&1 && sleep 0' 10896 1726882161.07252: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882161.07456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882161.07562: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882161.08014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 10896 1726882161.09738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882161.09773: stderr chunk (state=3): >>><<< 10896 1726882161.09783: stdout chunk (state=3): >>><<< 10896 1726882161.09983: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 10896 1726882161.09986: handler run complete 10896 1726882161.10042: variable 'ansible_facts' from source: unknown 10896 1726882161.10310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882161.10567: variable 'ansible_facts' from source: unknown 10896 1726882161.10799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882161.11240: attempt loop complete, returning result 10896 1726882161.11244: _execute() done 10896 1726882161.11246: dumping result to json 10896 1726882161.11249: done dumping result, returning 10896 1726882161.11251: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [12673a56-9f93-8b02-b216-0000000000de] 10896 1726882161.11254: sending task result for task 12673a56-9f93-8b02-b216-0000000000de 10896 1726882161.11707: done sending task result for task 12673a56-9f93-8b02-b216-0000000000de 10896 1726882161.11711: WORKER PROCESS EXITING ok: [managed_node2] 10896 1726882161.11842: no more pending results, returning what we have 10896 1726882161.11845: results queue empty 10896 1726882161.11846: checking for any_errors_fatal 10896 1726882161.11848: done checking for any_errors_fatal 10896 1726882161.11849: checking for max_fail_percentage 10896 1726882161.11850: done checking for max_fail_percentage 10896 1726882161.11851: checking to see if all hosts have failed and the running result is not ok 10896 1726882161.11852: done 
checking to see if all hosts have failed 10896 1726882161.11852: getting the remaining hosts for this loop 10896 1726882161.11854: done getting the remaining hosts for this loop 10896 1726882161.11858: getting the next task for host managed_node2 10896 1726882161.11868: done getting next task for host managed_node2 10896 1726882161.11871: ^ task is: TASK: Check if system is ostree 10896 1726882161.11874: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882161.11878: getting variables 10896 1726882161.11879: in VariableManager get_vars() 10896 1726882161.12514: Calling all_inventory to load vars for managed_node2 10896 1726882161.12517: Calling groups_inventory to load vars for managed_node2 10896 1726882161.12520: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882161.12530: Calling all_plugins_play to load vars for managed_node2 10896 1726882161.12533: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882161.12536: Calling groups_plugins_play to load vars for managed_node2 10896 1726882161.13141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882161.13552: done with get_vars() 10896 1726882161.13564: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:29:21 -0400 (0:00:00.744) 0:00:02.705 ****** 10896 1726882161.13883: entering _queue_task() for managed_node2/stat 10896 1726882161.14630: worker is 1 (out of 1 available) 10896 1726882161.14642: exiting _queue_task() for managed_node2/stat 10896 1726882161.14651: done queuing things up, now waiting for results queue to drain 10896 1726882161.14652: waiting for pending results... 
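The task queued here, "Check if system is ostree" from el_repo_setup.yml:17, runs the stat action and, as the conditional evaluated below shows, only when __network_is_ostree is not already defined. The recorded trace does not include the task body itself; the sketch that follows is one plausible shape for such a check, and the /run/ostree-booted marker path and the __ostree_booted_stat variable name are assumptions rather than values taken from the test playbook.

- name: Check if system is ostree (illustrative sketch)
  ansible.builtin.stat:
    path: /run/ostree-booted        # assumed marker file present on OSTree-based systems
  register: __ostree_booted_stat    # hypothetical variable name
  when: not __network_is_ostree is defined

- name: Record the result for later tasks (illustrative sketch)
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined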
10896 1726882161.15605: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 10896 1726882161.15612: in run() - task 12673a56-9f93-8b02-b216-0000000000e0 10896 1726882161.15615: variable 'ansible_search_path' from source: unknown 10896 1726882161.15617: variable 'ansible_search_path' from source: unknown 10896 1726882161.15918: calling self._execute() 10896 1726882161.15922: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882161.16038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882161.16201: variable 'omit' from source: magic vars 10896 1726882161.17159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882161.17529: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882161.17684: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882161.17779: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882161.17836: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882161.18031: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882161.18144: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 1726882161.18170: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882161.18280: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882161.18586: Evaluated conditional (not __network_is_ostree is defined): True 10896 1726882161.18589: variable 'omit' from source: magic vars 10896 1726882161.18603: variable 'omit' from source: magic vars 10896 1726882161.18976: variable 'omit' from source: magic vars 10896 1726882161.19000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882161.19127: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882161.19168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882161.19171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882161.19284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882161.19316: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882161.19319: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882161.19321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882161.19602: Set connection var ansible_connection to ssh 10896 1726882161.19605: Set connection var ansible_timeout to 10 10896 1726882161.19607: Set connection var 
ansible_shell_type to sh 10896 1726882161.19610: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882161.19612: Set connection var ansible_shell_executable to /bin/sh 10896 1726882161.19614: Set connection var ansible_pipelining to False 10896 1726882161.19616: variable 'ansible_shell_executable' from source: unknown 10896 1726882161.19618: variable 'ansible_connection' from source: unknown 10896 1726882161.19622: variable 'ansible_module_compression' from source: unknown 10896 1726882161.19624: variable 'ansible_shell_type' from source: unknown 10896 1726882161.19626: variable 'ansible_shell_executable' from source: unknown 10896 1726882161.19628: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882161.19630: variable 'ansible_pipelining' from source: unknown 10896 1726882161.19632: variable 'ansible_timeout' from source: unknown 10896 1726882161.19634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882161.19901: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10896 1726882161.19905: variable 'omit' from source: magic vars 10896 1726882161.19907: starting attempt loop 10896 1726882161.19910: running the handler 10896 1726882161.19912: _low_level_execute_command(): starting 10896 1726882161.19914: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882161.21187: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882161.21206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882161.21226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882161.21244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882161.21261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882161.21276: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882161.21298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882161.21345: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882161.21413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882161.21457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882161.21486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882161.21661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 10896 1726882161.23879: stdout chunk (state=3): >>>/root <<< 10896 1726882161.24064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 
1726882161.24080: stdout chunk (state=3): >>><<< 10896 1726882161.24108: stderr chunk (state=3): >>><<< 10896 1726882161.24138: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 10896 1726882161.24200: _low_level_execute_command(): starting 10896 1726882161.24204: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882161.2415164-11066-27057914208529 `" && echo ansible-tmp-1726882161.2415164-11066-27057914208529="` echo /root/.ansible/tmp/ansible-tmp-1726882161.2415164-11066-27057914208529 `" ) && sleep 0' 10896 1726882161.24810: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882161.24825: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882161.24850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882161.24870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882161.24888: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882161.24968: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882161.25014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882161.25031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882161.25060: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882161.25283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 10896 1726882161.27929: stdout chunk (state=3): 
>>>ansible-tmp-1726882161.2415164-11066-27057914208529=/root/.ansible/tmp/ansible-tmp-1726882161.2415164-11066-27057914208529 <<< 10896 1726882161.28127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882161.28138: stdout chunk (state=3): >>><<< 10896 1726882161.28150: stderr chunk (state=3): >>><<< 10896 1726882161.28175: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882161.2415164-11066-27057914208529=/root/.ansible/tmp/ansible-tmp-1726882161.2415164-11066-27057914208529 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 10896 1726882161.28235: variable 'ansible_module_compression' from source: unknown 10896 1726882161.28498: ANSIBALLZ: Using lock for stat 10896 1726882161.28502: ANSIBALLZ: Acquiring lock 10896 1726882161.28504: ANSIBALLZ: Lock acquired: 139646160837552 10896 1726882161.28507: ANSIBALLZ: Creating module 10896 1726882161.43236: ANSIBALLZ: Writing module into payload 10896 1726882161.43352: ANSIBALLZ: Writing module 10896 1726882161.43387: ANSIBALLZ: Renaming module 10896 1726882161.43404: ANSIBALLZ: Done creating module 10896 1726882161.43439: variable 'ansible_facts' from source: unknown 10896 1726882161.43551: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882161.2415164-11066-27057914208529/AnsiballZ_stat.py 10896 1726882161.43875: Sending initial data 10896 1726882161.43878: Sent initial data (152 bytes) 10896 1726882161.44617: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882161.44675: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882161.44688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882161.44713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882161.44855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882161.47144: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882161.47198: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882161.47287: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpvwj87cln /root/.ansible/tmp/ansible-tmp-1726882161.2415164-11066-27057914208529/AnsiballZ_stat.py <<< 10896 1726882161.47290: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882161.2415164-11066-27057914208529/AnsiballZ_stat.py" <<< 10896 1726882161.47416: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpvwj87cln" to remote "/root/.ansible/tmp/ansible-tmp-1726882161.2415164-11066-27057914208529/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882161.2415164-11066-27057914208529/AnsiballZ_stat.py" <<< 10896 1726882161.48738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882161.48769: stderr chunk (state=3): >>><<< 10896 1726882161.48779: stdout chunk (state=3): >>><<< 10896 1726882161.48851: done transferring module to remote 10896 1726882161.49091: _low_level_execute_command(): starting 10896 1726882161.49097: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882161.2415164-11066-27057914208529/ /root/.ansible/tmp/ansible-tmp-1726882161.2415164-11066-27057914208529/AnsiballZ_stat.py && sleep 0' 10896 1726882161.50038: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882161.50076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882161.50213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882161.50231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882161.50344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882161.52458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882161.52461: stdout chunk (state=3): >>><<< 10896 1726882161.52463: stderr chunk (state=3): >>><<< 10896 1726882161.52478: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882161.52486: _low_level_execute_command(): starting 10896 1726882161.52497: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882161.2415164-11066-27057914208529/AnsiballZ_stat.py && sleep 0' 10896 1726882161.53812: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882161.53866: stderr chunk (state=3): >>>debug2: match found <<< 10896 1726882161.53870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882161.53935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882161.53939: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 10896 1726882161.53941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882161.54021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882161.56437: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<< 10896 1726882161.56454: stdout chunk (state=3): >>>import 'posix' # <<< 10896 1726882161.56485: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 10896 1726882161.56512: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 10896 1726882161.56579: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 10896 1726882161.56617: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<< 10896 1726882161.56695: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e2e04d0> <<< 10896 1726882161.56773: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e2afb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e2e2a50> import '_signal' # import '_abc' # <<< 10896 1726882161.56798: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 10896 1726882161.56832: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 10896 1726882161.57154: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e091130> <<< 10896 1726882161.57165: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 10896 1726882161.57168: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e091fa0> <<< 10896 1726882161.57188: stdout chunk (state=3): >>>import 'site' # <<< 10896 1726882161.57313: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] 
on linux Type "help", "copyright", "credits" or "license" for more information. <<< 10896 1726882161.57624: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e0cfe60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 10896 1726882161.57650: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 10896 1726882161.57666: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e0cff20> <<< 10896 1726882161.57685: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 10896 1726882161.57715: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 10896 1726882161.57726: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 10896 1726882161.57786: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 10896 1726882161.57803: stdout chunk (state=3): >>>import 'itertools' # <<< 10896 1726882161.57833: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e107890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 10896 1726882161.57860: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e107f20> <<< 10896 1726882161.57963: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e0e7b30> <<< 10896 1726882161.57967: stdout chunk (state=3): >>>import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e0e5250> <<< 10896 1726882161.58076: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e0cd010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 10896 1726882161.58395: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 10896 1726882161.58399: stdout chunk (state=3): >>>import '_sre' # # 
/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 10896 1726882161.58405: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e127800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e126450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e0e6120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e124cb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e15c860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e0cc290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 10896 1726882161.58408: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 10896 1726882161.58411: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233e15cd10> <<< 10896 1726882161.58551: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e15cbc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233e15cfb0> <<< 10896 1726882161.58554: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e0cadb0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e15d6a0> <<< 10896 1726882161.58560: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e15d370> import 'importlib.machinery' # <<< 10896 1726882161.58589: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 10896 1726882161.58609: 
stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e15e5a0> <<< 10896 1726882161.59024: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e1747a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233e175e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e176d20> <<< 10896 1726882161.59030: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233e177320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e176270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233e177da0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e1774d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e15e510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 10896 1726882161.59066: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 10896 1726882161.59069: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 10896 1726882161.59109: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233def3bf0> <<< 10896 1726882161.59135: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches 
/usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 10896 1726882161.59166: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233df1c740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233df1c4a0> <<< 10896 1726882161.59200: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233df1c680> <<< 10896 1726882161.59299: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 10896 1726882161.59316: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 10896 1726882161.59633: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233df1cfe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233df1d910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233df1c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233def1d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 10896 1726882161.59647: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233df1ed20> <<< 10896 1726882161.59676: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233df1da60> <<< 10896 1726882161.59695: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e15e750> <<< 10896 1726882161.59720: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 10896 1726882161.59792: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 10896 1726882161.59806: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 10896 1726882161.59836: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 10896 1726882161.59861: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233df47080> <<< 10896 1726882161.60348: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233df6b440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233dfcc230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 10896 1726882161.60356: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 10896 1726882161.60358: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233dfce990> <<< 10896 1726882161.60423: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233dfcc350> <<< 10896 1726882161.60459: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233df99250> <<< 10896 1726882161.60487: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d90d310> <<< 10896 1726882161.60507: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233df6a240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233df1fc50> <<< 10896 1726882161.60838: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f233d90d5b0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_4pzghb37/ansible_stat_payload.zip' # zipimport: zlib available <<< 10896 1726882161.60985: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.61009: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 10896 1726882161.61052: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches 
/usr/lib64/python3.12/typing.py <<< 10896 1726882161.61169: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d962fc0> import '_typing' # <<< 10896 1726882161.61617: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d941eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d9410a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 10896 1726882161.62877: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.64012: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d9612b0> <<< 10896 1726882161.64038: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 10896 1726882161.64067: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 10896 1726882161.64092: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 10896 1726882161.64121: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 10896 1726882161.64134: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d98e960> <<< 10896 1726882161.64175: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d98e6f0> <<< 10896 1726882161.64236: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d98e030> <<< 10896 1726882161.64240: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 10896 1726882161.64287: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d98e480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d963c50> import 'atexit' # <<< 10896 1726882161.64319: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from 
'/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d98f710> <<< 10896 1726882161.64351: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d98f950> <<< 10896 1726882161.64374: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 10896 1726882161.64547: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d98fe90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 10896 1726882161.64573: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d7f9c70> <<< 10896 1726882161.64610: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d7fb890> <<< 10896 1726882161.64637: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 10896 1726882161.64656: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 10896 1726882161.64696: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d7fc260> <<< 10896 1726882161.64711: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 10896 1726882161.64762: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 10896 1726882161.64767: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d7fd400> <<< 10896 1726882161.64779: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 10896 1726882161.64864: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 10896 1726882161.64874: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 10896 1726882161.64912: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d7ffec0> <<< 10896 1726882161.64922: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f233e0cfd10> <<< 10896 1726882161.65131: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d7fe180> <<< 10896 1726882161.65134: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 10896 1726882161.65136: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 10896 1726882161.65239: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 10896 1726882161.65243: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d807e90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d806990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d8066f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 10896 1726882161.65288: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d806c30> <<< 10896 1726882161.65317: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d7fe690> <<< 10896 1726882161.65347: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d84ff50> <<< 10896 1726882161.65372: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'<<< 10896 1726882161.65390: stdout chunk (state=3): >>> import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d850260> <<< 10896 1726882161.65407: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 10896 1726882161.65445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 10896 1726882161.65492: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 10896 1726882161.65513: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension 
module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d851d30> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d851af0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 10896 1726882161.65677: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 10896 1726882161.65687: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d8542c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d852420> <<< 10896 1726882161.65703: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 10896 1726882161.65895: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d857aa0> <<< 10896 1726882161.65945: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d854470> <<< 10896 1726882161.66019: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d858a40> <<< 10896 1726882161.66078: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d858ad0> <<< 10896 1726882161.66508: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d858d10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d850440> <<< 10896 1726882161.66511: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from 
'/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 10896 1726882161.66517: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d8e0290> <<< 10896 1726882161.66520: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d8e1580> <<< 10896 1726882161.66522: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d85aa20> <<< 10896 1726882161.66524: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d85bdd0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d85a690> <<< 10896 1726882161.66526: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 10896 1726882161.66559: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.66662: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10896 1726882161.66673: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 10896 1726882161.66776: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 10896 1726882161.66823: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.66944: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.67598: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.68005: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 10896 1726882161.68026: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 10896 1726882161.68201: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d6ed6d0> <<< 10896 1726882161.68205: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 10896 
1726882161.68257: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d6ee540> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d8e1730> <<< 10896 1726882161.68473: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 10896 1726882161.68476: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.68620: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 10896 1726882161.68633: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d6ee4b0> <<< 10896 1726882161.68648: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.69095: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.69530: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.69603: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.69794: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 10896 1726882161.69834: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.69921: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 10896 1726882161.69937: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.69947: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 10896 1726882161.69964: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.70002: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.70037: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 10896 1726882161.70048: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.70334: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.70498: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 10896 1726882161.70555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 10896 1726882161.70566: stdout chunk (state=3): >>>import '_ast' # <<< 10896 1726882161.70630: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d6ef6b0> <<< 10896 1726882161.70652: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.70715: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.70870: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 10896 1726882161.70921: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 10896 1726882161.70961: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.71006: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 10896 1726882161.71144: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 10896 1726882161.71189: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 10896 1726882161.71239: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d6fa210> <<< 10896 1726882161.71707: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d6f5970> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 10896 1726882161.71711: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 10896 1726882161.71812: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 10896 1726882161.71835: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d9e6b10> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d9d67e0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d6fa2d0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d858d70> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 10896 1726882161.71871: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.71882: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.71913: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 10896 1726882161.71924: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 10896 1726882161.71970: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 10896 1726882161.72029: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.72096: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 10896 1726882161.72101: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.72139: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 1726882161.72542: stdout chunk (state=3): >>># zipimport: zlib available <<< 10896 
1726882161.72545: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 10896 1726882161.72781: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp<<< 10896 1726882161.72812: stdout chunk (state=3): >>> # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools <<< 10896 1726882161.72938: stdout chunk (state=3): >>># cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # 
cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 10896 1726882161.73171: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 10896 1726882161.73226: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 10896 1726882161.73349: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 10896 1726882161.73372: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 10896 1726882161.73408: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes <<< 10896 1726882161.73440: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy 
textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 10896 1726882161.73466: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 10896 1726882161.73489: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 10896 1726882161.73509: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 10896 1726882161.73600: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 10896 1726882161.73604: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 10896 1726882161.73800: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 10896 1726882161.73898: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 10896 1726882161.73919: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random <<< 
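The stdout chunks streamed above carry the part of this exchange that matters: the module's JSON result reporting that /run/ostree-booted does not exist ("stat": {"exists": false}), followed by the interpreter's teardown messages. As a rough illustration of what that result says (a minimal sketch, not ansible.builtin.stat's implementation), the same existence check can be expressed directly, using the path from the logged module_args and the fact that follow=false examines the path itself rather than a link target:

```python
import os

# Illustrative stand-in for the check whose result appears in the chunks above;
# this is NOT ansible.builtin.stat. The logged module_args use follow=false, so
# the path itself is examined (lstat-style). On this host it does not exist,
# which matches "stat": {"exists": false} in the logged result.
PATH = "/run/ostree-booted"  # path taken from the logged module_args

result = {"changed": False, "stat": {"exists": os.path.lexists(PATH)}}
print(result)  # e.g. {'changed': False, 'stat': {'exists': False}} on a non-ostree host
```

Probing /run/ostree-booted is a common way to detect rpm-ostree based hosts; whether that is the intent of this particular task is not shown in this excerpt.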
10896 1726882161.73999: stdout chunk (state=3): >>># destroy _weakref # destroy _hashlib # destroy _operator <<< 10896 1726882161.74003: stdout chunk (state=3): >>># destroy _string # destroy re # destroy itertools <<< 10896 1726882161.74006: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 10896 1726882161.74008: stdout chunk (state=3): >>># clear sys.audit hooks <<< 10896 1726882161.74374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882161.74383: stdout chunk (state=3): >>><<< 10896 1726882161.74601: stderr chunk (state=3): >>><<< 10896 1726882161.74615: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e2e04d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e2afb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e2e2a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e091130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e091fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
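At this point the streaming is finished: the SSH control master reports exit status 0, the shared connection to 10.31.14.69 is closed, and _low_level_execute_command() returns rc=0 together with the fully assembled stdout, which repeats the streamed chunks verbatim and continues below. The "import 'x' # ...", "# cleanup[...]" and "# destroy ..." lines are the kind of import/teardown trace CPython emits when verbose import tracing is enabled; how the remote interpreter ended up tracing here is not visible in this excerpt. A small sketch that reproduces the same trace format locally:

```python
import subprocess
import sys

# Demonstrates the trace format seen in this dump: CPython's verbose mode (-v)
# prints "import 'x' # ..." lines while importing and "# cleanup"/"# destroy"
# lines at interpreter shutdown. This only shows the format; it is not how
# Ansible invoked the remote interpreter (that is not shown in this excerpt).
proc = subprocess.run(
    [sys.executable, "-v", "-c", "import json"],
    capture_output=True,
    text=True,
)
# CPython writes the verbose trace to stderr.
for line in proc.stderr.splitlines():
    if line.startswith(("import ", "# cleanup", "# destroy")):
        print(line)
```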
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e0cfe60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e0cff20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e107890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e107f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e0e7b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e0e5250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e0cd010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e127800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e126450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e0e6120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e124cb0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e15c860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e0cc290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233e15cd10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e15cbc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233e15cfb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e0cadb0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e15d6a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e15d370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e15e5a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e1747a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233e175e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f233e176d20> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233e177320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e176270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233e177da0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e1774d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e15e510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233def3bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233df1c740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233df1c4a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233df1c680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233df1cfe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233df1d910> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f233df1c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233def1d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233df1ed20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233df1da60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233e15e750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233df47080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233df6b440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233dfcc230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233dfce990> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233dfcc350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233df99250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d90d310> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233df6a240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233df1fc50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f233d90d5b0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_4pzghb37/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d962fc0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d941eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d9410a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d9612b0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d98e960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d98e6f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d98e030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d98e480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d963c50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d98f710> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d98f950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d98fe90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d7f9c70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d7fb890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d7fc260> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d7fd400> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d7ffec0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233e0cfd10> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d7fe180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d807e90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d806990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d8066f0> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d806c30> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d7fe690> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d84ff50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d850260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d851d30> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d851af0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d8542c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d852420> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d857aa0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d854470> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d858a40> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d858ad0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d858d10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d850440> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d8e0290> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d8e1580> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d85aa20> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d85bdd0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d85a690> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d6ed6d0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d6ee540> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d8e1730> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d6ee4b0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d6ef6b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f233d6fa210> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d6f5970> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d9e6b10> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d9d67e0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d6fa2d0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f233d858d70> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # 
destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # 
destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
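The module output above belongs to the stat call for the 'Check if system is ostree' task (its result is reported a few lines below). The task's YAML in el_repo_setup.yml is not reproduced in this log; a minimal sketch consistent with the traced module arguments (path=/run/ostree-booted) and the __ostree_booted_stat variable referenced later might be:

- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat

On this host /run/ostree-booted does not exist, so stat.exists comes back false.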
[WARNING]: Module invocation had junk after the JSON data: (Python interpreter shutdown/cleanup trace, identical to the module output shown above) 10896 1726882161.76518: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882161.2415164-11066-27057914208529/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882161.76522: _low_level_execute_command(): starting 10896 1726882161.76528: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r
/root/.ansible/tmp/ansible-tmp-1726882161.2415164-11066-27057914208529/ > /dev/null 2>&1 && sleep 0' 10896 1726882161.76530: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882161.76533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882161.76535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882161.76537: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882161.76540: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882161.76557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 10896 1726882161.79502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882161.79505: stdout chunk (state=3): >>><<< 10896 1726882161.79508: stderr chunk (state=3): >>><<< 10896 1726882161.79510: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 10896 1726882161.79512: handler run complete 10896 1726882161.79513: attempt loop complete, returning result 10896 1726882161.79515: _execute() done 10896 1726882161.79516: dumping result to json 10896 1726882161.79518: done dumping result, returning 10896 1726882161.79520: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [12673a56-9f93-8b02-b216-0000000000e0] 10896 1726882161.79521: sending task result for task 12673a56-9f93-8b02-b216-0000000000e0 10896 1726882161.79586: done sending task result for task 
12673a56-9f93-8b02-b216-0000000000e0 10896 1726882161.79589: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 10896 1726882161.79663: no more pending results, returning what we have 10896 1726882161.79665: results queue empty 10896 1726882161.79666: checking for any_errors_fatal 10896 1726882161.79671: done checking for any_errors_fatal 10896 1726882161.79672: checking for max_fail_percentage 10896 1726882161.79674: done checking for max_fail_percentage 10896 1726882161.79674: checking to see if all hosts have failed and the running result is not ok 10896 1726882161.79675: done checking to see if all hosts have failed 10896 1726882161.79676: getting the remaining hosts for this loop 10896 1726882161.79677: done getting the remaining hosts for this loop 10896 1726882161.79680: getting the next task for host managed_node2 10896 1726882161.79686: done getting next task for host managed_node2 10896 1726882161.79688: ^ task is: TASK: Set flag to indicate system is ostree 10896 1726882161.79690: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882161.79904: getting variables 10896 1726882161.79909: in VariableManager get_vars() 10896 1726882161.79938: Calling all_inventory to load vars for managed_node2 10896 1726882161.79941: Calling groups_inventory to load vars for managed_node2 10896 1726882161.79945: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882161.79955: Calling all_plugins_play to load vars for managed_node2 10896 1726882161.79957: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882161.79960: Calling groups_plugins_play to load vars for managed_node2 10896 1726882161.80456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882161.80948: done with get_vars() 10896 1726882161.80961: done getting variables 10896 1726882161.81266: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:29:21 -0400 (0:00:00.674) 0:00:03.380 ****** 10896 1726882161.81323: entering _queue_task() for managed_node2/set_fact 10896 1726882161.81325: Creating lock for set_fact 10896 1726882161.81812: worker is 1 (out of 1 available) 10896 1726882161.81824: exiting _queue_task() for managed_node2/set_fact 10896 1726882161.81836: done queuing things up, now waiting for results queue to drain 10896 1726882161.81837: waiting for pending results... 
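The set_fact task queued above runs only when __network_is_ostree is not already defined (the conditional evaluated in the trace that follows) and derives the flag from the registered stat result. The exact YAML is not included in this log; a plausible sketch is:

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined

Since /run/ostree-booted was absent, the fact is set to false, which later lets the EPEL setup include run.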
10896 1726882161.82035: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 10896 1726882161.82192: in run() - task 12673a56-9f93-8b02-b216-0000000000e1 10896 1726882161.82215: variable 'ansible_search_path' from source: unknown 10896 1726882161.82222: variable 'ansible_search_path' from source: unknown 10896 1726882161.82291: calling self._execute() 10896 1726882161.82348: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882161.82359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882161.82373: variable 'omit' from source: magic vars 10896 1726882161.83173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882161.83931: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882161.83934: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882161.83962: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882161.84002: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882161.84089: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882161.84128: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 1726882161.84185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882161.84278: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882161.84644: Evaluated conditional (not __network_is_ostree is defined): True 10896 1726882161.84648: variable 'omit' from source: magic vars 10896 1726882161.84650: variable 'omit' from source: magic vars 10896 1726882161.85080: variable '__ostree_booted_stat' from source: set_fact 10896 1726882161.85084: variable 'omit' from source: magic vars 10896 1726882161.85087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882161.85361: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882161.85364: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882161.85470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882161.85473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882161.85476: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882161.85479: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882161.85481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882161.85644: Set connection var ansible_connection to ssh 10896 
1726882161.85649: Set connection var ansible_timeout to 10 10896 1726882161.85652: Set connection var ansible_shell_type to sh 10896 1726882161.85661: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882161.85665: Set connection var ansible_shell_executable to /bin/sh 10896 1726882161.85706: Set connection var ansible_pipelining to False 10896 1726882161.85736: variable 'ansible_shell_executable' from source: unknown 10896 1726882161.85746: variable 'ansible_connection' from source: unknown 10896 1726882161.85754: variable 'ansible_module_compression' from source: unknown 10896 1726882161.85761: variable 'ansible_shell_type' from source: unknown 10896 1726882161.85767: variable 'ansible_shell_executable' from source: unknown 10896 1726882161.85780: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882161.85791: variable 'ansible_pipelining' from source: unknown 10896 1726882161.85805: variable 'ansible_timeout' from source: unknown 10896 1726882161.85814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882161.85935: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882161.85958: variable 'omit' from source: magic vars 10896 1726882161.85968: starting attempt loop 10896 1726882161.85974: running the handler 10896 1726882161.85990: handler run complete 10896 1726882161.86013: attempt loop complete, returning result 10896 1726882161.86022: _execute() done 10896 1726882161.86029: dumping result to json 10896 1726882161.86035: done dumping result, returning 10896 1726882161.86045: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [12673a56-9f93-8b02-b216-0000000000e1] 10896 1726882161.86054: sending task result for task 12673a56-9f93-8b02-b216-0000000000e1 ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 10896 1726882161.86297: no more pending results, returning what we have 10896 1726882161.86301: results queue empty 10896 1726882161.86302: checking for any_errors_fatal 10896 1726882161.86306: done checking for any_errors_fatal 10896 1726882161.86307: checking for max_fail_percentage 10896 1726882161.86309: done checking for max_fail_percentage 10896 1726882161.86309: checking to see if all hosts have failed and the running result is not ok 10896 1726882161.86310: done checking to see if all hosts have failed 10896 1726882161.86311: getting the remaining hosts for this loop 10896 1726882161.86312: done getting the remaining hosts for this loop 10896 1726882161.86315: getting the next task for host managed_node2 10896 1726882161.86324: done getting next task for host managed_node2 10896 1726882161.86326: ^ task is: TASK: Fix CentOS6 Base repo 10896 1726882161.86329: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 10896 1726882161.86332: getting variables 10896 1726882161.86334: in VariableManager get_vars() 10896 1726882161.86363: Calling all_inventory to load vars for managed_node2 10896 1726882161.86365: Calling groups_inventory to load vars for managed_node2 10896 1726882161.86368: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882161.86378: Calling all_plugins_play to load vars for managed_node2 10896 1726882161.86380: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882161.86382: Calling groups_plugins_play to load vars for managed_node2 10896 1726882161.86568: done sending task result for task 12673a56-9f93-8b02-b216-0000000000e1 10896 1726882161.86576: WORKER PROCESS EXITING 10896 1726882161.86602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882161.86731: done with get_vars() 10896 1726882161.86738: done getting variables 10896 1726882161.86826: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:29:21 -0400 (0:00:00.055) 0:00:03.435 ****** 10896 1726882161.86846: entering _queue_task() for managed_node2/copy 10896 1726882161.87036: worker is 1 (out of 1 available) 10896 1726882161.87048: exiting _queue_task() for managed_node2/copy 10896 1726882161.87060: done queuing things up, now waiting for results queue to drain 10896 1726882161.87061: waiting for pending results... 
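The task queued above, 'Fix CentOS6 Base repo', is a copy action guarded by the two conditionals evaluated in the trace that follows (ansible_distribution == 'CentOS' and ansible_distribution_major_version == '6'). Its source content and destination are not captured in this log, so both are placeholders in this sketch:

- name: Fix CentOS6 Base repo
  ansible.builtin.copy:
    content: |
      # replacement repo definition would go here (file contents are not shown in this log)
    dest: /etc/yum.repos.d/CentOS-Base.repo  # hypothetical destination, for illustration only
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'

On this CentOS host the major version is not 6, so the task is skipped.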
10896 1726882161.87195: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 10896 1726882161.87271: in run() - task 12673a56-9f93-8b02-b216-0000000000e3 10896 1726882161.87282: variable 'ansible_search_path' from source: unknown 10896 1726882161.87287: variable 'ansible_search_path' from source: unknown 10896 1726882161.87315: calling self._execute() 10896 1726882161.87371: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882161.87375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882161.87383: variable 'omit' from source: magic vars 10896 1726882161.87710: variable 'ansible_distribution' from source: facts 10896 1726882161.87726: Evaluated conditional (ansible_distribution == 'CentOS'): True 10896 1726882161.87809: variable 'ansible_distribution_major_version' from source: facts 10896 1726882161.87812: Evaluated conditional (ansible_distribution_major_version == '6'): False 10896 1726882161.87815: when evaluation is False, skipping this task 10896 1726882161.87818: _execute() done 10896 1726882161.87820: dumping result to json 10896 1726882161.87823: done dumping result, returning 10896 1726882161.87832: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [12673a56-9f93-8b02-b216-0000000000e3] 10896 1726882161.87834: sending task result for task 12673a56-9f93-8b02-b216-0000000000e3 10896 1726882161.87918: done sending task result for task 12673a56-9f93-8b02-b216-0000000000e3 10896 1726882161.87921: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 10896 1726882161.87987: no more pending results, returning what we have 10896 1726882161.87989: results queue empty 10896 1726882161.87990: checking for any_errors_fatal 10896 1726882161.87996: done checking for any_errors_fatal 10896 1726882161.87997: checking for max_fail_percentage 10896 1726882161.87998: done checking for max_fail_percentage 10896 1726882161.87999: checking to see if all hosts have failed and the running result is not ok 10896 1726882161.87999: done checking to see if all hosts have failed 10896 1726882161.88000: getting the remaining hosts for this loop 10896 1726882161.88001: done getting the remaining hosts for this loop 10896 1726882161.88004: getting the next task for host managed_node2 10896 1726882161.88008: done getting next task for host managed_node2 10896 1726882161.88010: ^ task is: TASK: Include the task 'enable_epel.yml' 10896 1726882161.88013: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882161.88016: getting variables 10896 1726882161.88017: in VariableManager get_vars() 10896 1726882161.88041: Calling all_inventory to load vars for managed_node2 10896 1726882161.88043: Calling groups_inventory to load vars for managed_node2 10896 1726882161.88045: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882161.88051: Calling all_plugins_play to load vars for managed_node2 10896 1726882161.88053: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882161.88054: Calling groups_plugins_play to load vars for managed_node2 10896 1726882161.88170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882161.88281: done with get_vars() 10896 1726882161.88287: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:29:21 -0400 (0:00:00.015) 0:00:03.450 ****** 10896 1726882161.88352: entering _queue_task() for managed_node2/include_tasks 10896 1726882161.88549: worker is 1 (out of 1 available) 10896 1726882161.88559: exiting _queue_task() for managed_node2/include_tasks 10896 1726882161.88570: done queuing things up, now waiting for results queue to drain 10896 1726882161.88571: waiting for pending results... 10896 1726882161.88764: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 10896 1726882161.88831: in run() - task 12673a56-9f93-8b02-b216-0000000000e4 10896 1726882161.88837: variable 'ansible_search_path' from source: unknown 10896 1726882161.88841: variable 'ansible_search_path' from source: unknown 10896 1726882161.88868: calling self._execute() 10896 1726882161.88922: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882161.88926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882161.88938: variable 'omit' from source: magic vars 10896 1726882161.89316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882161.90834: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882161.90883: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882161.90914: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882161.90940: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882161.90960: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882161.91017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882161.91039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882161.91057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 10896 1726882161.91082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882161.91098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882161.91177: variable '__network_is_ostree' from source: set_fact 10896 1726882161.91190: Evaluated conditional (not __network_is_ostree | d(false)): True 10896 1726882161.91197: _execute() done 10896 1726882161.91200: dumping result to json 10896 1726882161.91202: done dumping result, returning 10896 1726882161.91207: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [12673a56-9f93-8b02-b216-0000000000e4] 10896 1726882161.91212: sending task result for task 12673a56-9f93-8b02-b216-0000000000e4 10896 1726882161.91287: done sending task result for task 12673a56-9f93-8b02-b216-0000000000e4 10896 1726882161.91289: WORKER PROCESS EXITING 10896 1726882161.91321: no more pending results, returning what we have 10896 1726882161.91326: in VariableManager get_vars() 10896 1726882161.91356: Calling all_inventory to load vars for managed_node2 10896 1726882161.91359: Calling groups_inventory to load vars for managed_node2 10896 1726882161.91362: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882161.91372: Calling all_plugins_play to load vars for managed_node2 10896 1726882161.91374: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882161.91377: Calling groups_plugins_play to load vars for managed_node2 10896 1726882161.91542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882161.91650: done with get_vars() 10896 1726882161.91655: variable 'ansible_search_path' from source: unknown 10896 1726882161.91656: variable 'ansible_search_path' from source: unknown 10896 1726882161.91679: we have included files to process 10896 1726882161.91680: generating all_blocks data 10896 1726882161.91681: done generating all_blocks data 10896 1726882161.91686: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 10896 1726882161.91687: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 10896 1726882161.91689: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 10896 1726882161.92110: done processing included file 10896 1726882161.92112: iterating over new_blocks loaded from include file 10896 1726882161.92113: in VariableManager get_vars() 10896 1726882161.92121: done with get_vars() 10896 1726882161.92122: filtering new block on tags 10896 1726882161.92136: done filtering new block on tags 10896 1726882161.92138: in VariableManager get_vars() 10896 1726882161.92144: done with get_vars() 10896 1726882161.92145: filtering new block on tags 10896 1726882161.92151: done filtering new block on tags 10896 1726882161.92152: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2 10896 1726882161.92156: extending task lists for all hosts with included blocks 
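The include step above pulls in enable_epel.yml only when the host is not ostree-based, using the condition evaluated in the trace. A sketch of such an include task could be:

- name: Include the task 'enable_epel.yml'
  ansible.builtin.include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)

Because __network_is_ostree is false here, the condition evaluates to True and the included tasks are added to the task list.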
10896 1726882161.92216: done extending task lists 10896 1726882161.92217: done processing included files 10896 1726882161.92218: results queue empty 10896 1726882161.92218: checking for any_errors_fatal 10896 1726882161.92220: done checking for any_errors_fatal 10896 1726882161.92221: checking for max_fail_percentage 10896 1726882161.92221: done checking for max_fail_percentage 10896 1726882161.92222: checking to see if all hosts have failed and the running result is not ok 10896 1726882161.92222: done checking to see if all hosts have failed 10896 1726882161.92223: getting the remaining hosts for this loop 10896 1726882161.92223: done getting the remaining hosts for this loop 10896 1726882161.92225: getting the next task for host managed_node2 10896 1726882161.92228: done getting next task for host managed_node2 10896 1726882161.92230: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 10896 1726882161.92232: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882161.92234: getting variables 10896 1726882161.92235: in VariableManager get_vars() 10896 1726882161.92240: Calling all_inventory to load vars for managed_node2 10896 1726882161.92241: Calling groups_inventory to load vars for managed_node2 10896 1726882161.92243: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882161.92246: Calling all_plugins_play to load vars for managed_node2 10896 1726882161.92252: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882161.92254: Calling groups_plugins_play to load vars for managed_node2 10896 1726882161.92347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882161.92458: done with get_vars() 10896 1726882161.92464: done getting variables 10896 1726882161.92510: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 10896 1726882161.92643: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:29:21 -0400 (0:00:00.043) 0:00:03.493 ****** 10896 1726882161.92676: entering _queue_task() for managed_node2/command 10896 1726882161.92677: Creating lock for command 10896 1726882161.92858: worker is 1 (out of 1 available) 10896 1726882161.92868: exiting _queue_task() for managed_node2/command 10896 1726882161.92880: done queuing things up, now waiting for results queue to drain 10896 1726882161.92881: waiting for pending results... 
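The 'Create EPEL {{ ansible_distribution_major_version }}' task (rendered as 'Create EPEL 10' on this host) queued above is a command action that only applies on RHEL/CentOS 7 or 8, per the conditionals in the trace that follows. The actual command is not captured in this log, so the cmd below is a placeholder:

- name: Create EPEL {{ ansible_distribution_major_version }}
  ansible.builtin.command:
    cmd: echo "placeholder for the EPEL setup command (not captured in this log)"
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']

With a major version of 10, the second conditional is False and the task is skipped.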
10896 1726882161.93018: running TaskExecutor() for managed_node2/TASK: Create EPEL 10 10896 1726882161.93076: in run() - task 12673a56-9f93-8b02-b216-0000000000fe 10896 1726882161.93086: variable 'ansible_search_path' from source: unknown 10896 1726882161.93089: variable 'ansible_search_path' from source: unknown 10896 1726882161.93121: calling self._execute() 10896 1726882161.93172: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882161.93176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882161.93184: variable 'omit' from source: magic vars 10896 1726882161.93442: variable 'ansible_distribution' from source: facts 10896 1726882161.93445: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 10896 1726882161.93523: variable 'ansible_distribution_major_version' from source: facts 10896 1726882161.93527: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 10896 1726882161.93530: when evaluation is False, skipping this task 10896 1726882161.93532: _execute() done 10896 1726882161.93536: dumping result to json 10896 1726882161.93538: done dumping result, returning 10896 1726882161.93544: done running TaskExecutor() for managed_node2/TASK: Create EPEL 10 [12673a56-9f93-8b02-b216-0000000000fe] 10896 1726882161.93554: sending task result for task 12673a56-9f93-8b02-b216-0000000000fe 10896 1726882161.93638: done sending task result for task 12673a56-9f93-8b02-b216-0000000000fe 10896 1726882161.93641: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 10896 1726882161.93703: no more pending results, returning what we have 10896 1726882161.93706: results queue empty 10896 1726882161.93707: checking for any_errors_fatal 10896 1726882161.93708: done checking for any_errors_fatal 10896 1726882161.93708: checking for max_fail_percentage 10896 1726882161.93710: done checking for max_fail_percentage 10896 1726882161.93711: checking to see if all hosts have failed and the running result is not ok 10896 1726882161.93711: done checking to see if all hosts have failed 10896 1726882161.93712: getting the remaining hosts for this loop 10896 1726882161.93713: done getting the remaining hosts for this loop 10896 1726882161.93716: getting the next task for host managed_node2 10896 1726882161.93721: done getting next task for host managed_node2 10896 1726882161.93723: ^ task is: TASK: Install yum-utils package 10896 1726882161.93726: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882161.93728: getting variables 10896 1726882161.93730: in VariableManager get_vars() 10896 1726882161.93749: Calling all_inventory to load vars for managed_node2 10896 1726882161.93751: Calling groups_inventory to load vars for managed_node2 10896 1726882161.93753: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882161.93759: Calling all_plugins_play to load vars for managed_node2 10896 1726882161.93761: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882161.93765: Calling groups_plugins_play to load vars for managed_node2 10896 1726882161.93883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882161.93995: done with get_vars() 10896 1726882161.94002: done getting variables 10896 1726882161.94062: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:29:21 -0400 (0:00:00.014) 0:00:03.507 ****** 10896 1726882161.94081: entering _queue_task() for managed_node2/package 10896 1726882161.94082: Creating lock for package 10896 1726882161.94265: worker is 1 (out of 1 available) 10896 1726882161.94276: exiting _queue_task() for managed_node2/package 10896 1726882161.94288: done queuing things up, now waiting for results queue to drain 10896 1726882161.94289: waiting for pending results... 
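The 'Install yum-utils package' task queued above is a package action under the same version guard, as the trace that follows shows. Assuming the package name matches the task name, a sketch is:

- name: Install yum-utils package
  ansible.builtin.package:
    name: yum-utils
    state: present
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']

It is likewise skipped here because the distribution major version is not 7 or 8.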
10896 1726882161.94415: running TaskExecutor() for managed_node2/TASK: Install yum-utils package 10896 1726882161.94473: in run() - task 12673a56-9f93-8b02-b216-0000000000ff 10896 1726882161.94483: variable 'ansible_search_path' from source: unknown 10896 1726882161.94486: variable 'ansible_search_path' from source: unknown 10896 1726882161.94514: calling self._execute() 10896 1726882161.94563: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882161.94566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882161.94574: variable 'omit' from source: magic vars 10896 1726882161.94818: variable 'ansible_distribution' from source: facts 10896 1726882161.94828: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 10896 1726882161.94912: variable 'ansible_distribution_major_version' from source: facts 10896 1726882161.94915: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 10896 1726882161.94918: when evaluation is False, skipping this task 10896 1726882161.94921: _execute() done 10896 1726882161.94924: dumping result to json 10896 1726882161.94926: done dumping result, returning 10896 1726882161.94932: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [12673a56-9f93-8b02-b216-0000000000ff] 10896 1726882161.94937: sending task result for task 12673a56-9f93-8b02-b216-0000000000ff 10896 1726882161.95019: done sending task result for task 12673a56-9f93-8b02-b216-0000000000ff 10896 1726882161.95021: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 10896 1726882161.95062: no more pending results, returning what we have 10896 1726882161.95065: results queue empty 10896 1726882161.95066: checking for any_errors_fatal 10896 1726882161.95070: done checking for any_errors_fatal 10896 1726882161.95070: checking for max_fail_percentage 10896 1726882161.95072: done checking for max_fail_percentage 10896 1726882161.95072: checking to see if all hosts have failed and the running result is not ok 10896 1726882161.95073: done checking to see if all hosts have failed 10896 1726882161.95074: getting the remaining hosts for this loop 10896 1726882161.95075: done getting the remaining hosts for this loop 10896 1726882161.95078: getting the next task for host managed_node2 10896 1726882161.95082: done getting next task for host managed_node2 10896 1726882161.95083: ^ task is: TASK: Enable EPEL 7 10896 1726882161.95087: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882161.95089: getting variables 10896 1726882161.95090: in VariableManager get_vars() 10896 1726882161.95115: Calling all_inventory to load vars for managed_node2 10896 1726882161.95118: Calling groups_inventory to load vars for managed_node2 10896 1726882161.95120: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882161.95128: Calling all_plugins_play to load vars for managed_node2 10896 1726882161.95130: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882161.95133: Calling groups_plugins_play to load vars for managed_node2 10896 1726882161.95230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882161.95343: done with get_vars() 10896 1726882161.95351: done getting variables 10896 1726882161.95386: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:29:21 -0400 (0:00:00.013) 0:00:03.521 ****** 10896 1726882161.95408: entering _queue_task() for managed_node2/command 10896 1726882161.95567: worker is 1 (out of 1 available) 10896 1726882161.95577: exiting _queue_task() for managed_node2/command 10896 1726882161.95587: done queuing things up, now waiting for results queue to drain 10896 1726882161.95588: waiting for pending results... 10896 1726882161.95727: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 10896 1726882161.96000: in run() - task 12673a56-9f93-8b02-b216-000000000100 10896 1726882161.96003: variable 'ansible_search_path' from source: unknown 10896 1726882161.96006: variable 'ansible_search_path' from source: unknown 10896 1726882161.96009: calling self._execute() 10896 1726882161.96013: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882161.96016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882161.96020: variable 'omit' from source: magic vars 10896 1726882161.96321: variable 'ansible_distribution' from source: facts 10896 1726882161.96336: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 10896 1726882161.96452: variable 'ansible_distribution_major_version' from source: facts 10896 1726882161.96462: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 10896 1726882161.96469: when evaluation is False, skipping this task 10896 1726882161.96474: _execute() done 10896 1726882161.96479: dumping result to json 10896 1726882161.96484: done dumping result, returning 10896 1726882161.96492: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [12673a56-9f93-8b02-b216-000000000100] 10896 1726882161.96504: sending task result for task 12673a56-9f93-8b02-b216-000000000100 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 10896 1726882161.96621: no more pending results, returning what we have 10896 1726882161.96624: results queue empty 10896 1726882161.96624: checking for any_errors_fatal 10896 1726882161.96630: done checking 
for any_errors_fatal 10896 1726882161.96631: checking for max_fail_percentage 10896 1726882161.96633: done checking for max_fail_percentage 10896 1726882161.96633: checking to see if all hosts have failed and the running result is not ok 10896 1726882161.96634: done checking to see if all hosts have failed 10896 1726882161.96635: getting the remaining hosts for this loop 10896 1726882161.96636: done getting the remaining hosts for this loop 10896 1726882161.96639: getting the next task for host managed_node2 10896 1726882161.96645: done getting next task for host managed_node2 10896 1726882161.96647: ^ task is: TASK: Enable EPEL 8 10896 1726882161.96650: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882161.96653: getting variables 10896 1726882161.96654: in VariableManager get_vars() 10896 1726882161.96682: Calling all_inventory to load vars for managed_node2 10896 1726882161.96686: Calling groups_inventory to load vars for managed_node2 10896 1726882161.96689: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882161.96701: Calling all_plugins_play to load vars for managed_node2 10896 1726882161.96703: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882161.96706: Calling groups_plugins_play to load vars for managed_node2 10896 1726882161.96872: done sending task result for task 12673a56-9f93-8b02-b216-000000000100 10896 1726882161.96875: WORKER PROCESS EXITING 10896 1726882161.96902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882161.97075: done with get_vars() 10896 1726882161.97083: done getting variables 10896 1726882161.97134: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:29:21 -0400 (0:00:00.017) 0:00:03.538 ****** 10896 1726882161.97158: entering _queue_task() for managed_node2/command 10896 1726882161.97454: worker is 1 (out of 1 available) 10896 1726882161.97463: exiting _queue_task() for managed_node2/command 10896 1726882161.97471: done queuing things up, now waiting for results queue to drain 10896 1726882161.97472: waiting for pending results... 
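"Enable EPEL 7" and "Enable EPEL 8" both resolve to the command action plugin and carry the same ['7', '8'] guard, while "Enable EPEL 6" (reached a few entries below) resolves to the copy action and is guarded by ansible_distribution_major_version == '6'. A hedged sketch of that shape follows; the command line and file content are placeholders, since the log never prints the task bodies.

---
# Sketch only; module arguments are illustrative placeholders, not the real task file.
- name: Enable EPEL 7
  ansible.builtin.command: yum-config-manager --enable epel   # placeholder command
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']

- name: Enable EPEL 6
  ansible.builtin.copy:
    content: "[epel]\nenabled=1\n"        # placeholder content
    dest: /etc/yum.repos.d/epel.repo      # placeholder path
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version == '6'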
10896 1726882161.97587: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 10896 1726882161.97678: in run() - task 12673a56-9f93-8b02-b216-000000000101 10896 1726882161.97699: variable 'ansible_search_path' from source: unknown 10896 1726882161.97709: variable 'ansible_search_path' from source: unknown 10896 1726882161.97742: calling self._execute() 10896 1726882161.97865: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882161.97869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882161.97871: variable 'omit' from source: magic vars 10896 1726882161.98121: variable 'ansible_distribution' from source: facts 10896 1726882161.98131: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 10896 1726882161.98221: variable 'ansible_distribution_major_version' from source: facts 10896 1726882161.98224: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 10896 1726882161.98227: when evaluation is False, skipping this task 10896 1726882161.98230: _execute() done 10896 1726882161.98233: dumping result to json 10896 1726882161.98235: done dumping result, returning 10896 1726882161.98244: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [12673a56-9f93-8b02-b216-000000000101] 10896 1726882161.98254: sending task result for task 12673a56-9f93-8b02-b216-000000000101 10896 1726882161.98328: done sending task result for task 12673a56-9f93-8b02-b216-000000000101 10896 1726882161.98331: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 10896 1726882161.98389: no more pending results, returning what we have 10896 1726882161.98392: results queue empty 10896 1726882161.98395: checking for any_errors_fatal 10896 1726882161.98398: done checking for any_errors_fatal 10896 1726882161.98399: checking for max_fail_percentage 10896 1726882161.98400: done checking for max_fail_percentage 10896 1726882161.98401: checking to see if all hosts have failed and the running result is not ok 10896 1726882161.98402: done checking to see if all hosts have failed 10896 1726882161.98402: getting the remaining hosts for this loop 10896 1726882161.98403: done getting the remaining hosts for this loop 10896 1726882161.98406: getting the next task for host managed_node2 10896 1726882161.98412: done getting next task for host managed_node2 10896 1726882161.98414: ^ task is: TASK: Enable EPEL 6 10896 1726882161.98417: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882161.98420: getting variables 10896 1726882161.98421: in VariableManager get_vars() 10896 1726882161.98441: Calling all_inventory to load vars for managed_node2 10896 1726882161.98444: Calling groups_inventory to load vars for managed_node2 10896 1726882161.98446: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882161.98452: Calling all_plugins_play to load vars for managed_node2 10896 1726882161.98453: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882161.98455: Calling groups_plugins_play to load vars for managed_node2 10896 1726882161.98552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882161.98661: done with get_vars() 10896 1726882161.98667: done getting variables 10896 1726882161.98705: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:29:21 -0400 (0:00:00.015) 0:00:03.554 ****** 10896 1726882161.98723: entering _queue_task() for managed_node2/copy 10896 1726882161.98871: worker is 1 (out of 1 available) 10896 1726882161.98881: exiting _queue_task() for managed_node2/copy 10896 1726882161.98891: done queuing things up, now waiting for results queue to drain 10896 1726882161.98892: waiting for pending results... 10896 1726882161.99018: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 10896 1726882161.99076: in run() - task 12673a56-9f93-8b02-b216-000000000103 10896 1726882161.99085: variable 'ansible_search_path' from source: unknown 10896 1726882161.99089: variable 'ansible_search_path' from source: unknown 10896 1726882161.99116: calling self._execute() 10896 1726882161.99166: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882161.99170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882161.99178: variable 'omit' from source: magic vars 10896 1726882161.99448: variable 'ansible_distribution' from source: facts 10896 1726882161.99458: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 10896 1726882161.99534: variable 'ansible_distribution_major_version' from source: facts 10896 1726882161.99538: Evaluated conditional (ansible_distribution_major_version == '6'): False 10896 1726882161.99540: when evaluation is False, skipping this task 10896 1726882161.99543: _execute() done 10896 1726882161.99545: dumping result to json 10896 1726882161.99550: done dumping result, returning 10896 1726882161.99555: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [12673a56-9f93-8b02-b216-000000000103] 10896 1726882161.99559: sending task result for task 12673a56-9f93-8b02-b216-000000000103 10896 1726882161.99638: done sending task result for task 12673a56-9f93-8b02-b216-000000000103 10896 1726882161.99641: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 10896 1726882161.99711: no more pending results, returning what we have 10896 
1726882161.99713: results queue empty 10896 1726882161.99714: checking for any_errors_fatal 10896 1726882161.99718: done checking for any_errors_fatal 10896 1726882161.99718: checking for max_fail_percentage 10896 1726882161.99720: done checking for max_fail_percentage 10896 1726882161.99721: checking to see if all hosts have failed and the running result is not ok 10896 1726882161.99721: done checking to see if all hosts have failed 10896 1726882161.99722: getting the remaining hosts for this loop 10896 1726882161.99723: done getting the remaining hosts for this loop 10896 1726882161.99725: getting the next task for host managed_node2 10896 1726882161.99731: done getting next task for host managed_node2 10896 1726882161.99734: ^ task is: TASK: Set network provider to 'nm' 10896 1726882161.99736: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882161.99739: getting variables 10896 1726882161.99740: in VariableManager get_vars() 10896 1726882161.99760: Calling all_inventory to load vars for managed_node2 10896 1726882161.99762: Calling groups_inventory to load vars for managed_node2 10896 1726882161.99764: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882161.99770: Calling all_plugins_play to load vars for managed_node2 10896 1726882161.99771: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882161.99773: Calling groups_plugins_play to load vars for managed_node2 10896 1726882161.99895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882162.00001: done with get_vars() 10896 1726882162.00008: done getting variables 10896 1726882162.00043: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:13 Friday 20 September 2024 21:29:21 -0400 (0:00:00.013) 0:00:03.567 ****** 10896 1726882162.00058: entering _queue_task() for managed_node2/set_fact 10896 1726882162.00208: worker is 1 (out of 1 available) 10896 1726882162.00219: exiting _queue_task() for managed_node2/set_fact 10896 1726882162.00230: done queuing things up, now waiting for results queue to drain 10896 1726882162.00232: waiting for pending results... 
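The next entry runs the "Set network provider to 'nm'" task from tests_bond_deprecated_nm.yml:13 through the set_fact action, and the ok: result below records ansible_facts.network_provider set to "nm". The minimal task shape consistent with that result is sketched here; the file itself is not printed in the log.

---
# Sketch consistent with the recorded set_fact result below.
- name: Set network provider to 'nm'
  ansible.builtin.set_fact:
    network_provider: nm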
10896 1726882162.00362: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 10896 1726882162.00409: in run() - task 12673a56-9f93-8b02-b216-000000000007 10896 1726882162.00422: variable 'ansible_search_path' from source: unknown 10896 1726882162.00445: calling self._execute() 10896 1726882162.00498: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882162.00505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882162.00514: variable 'omit' from source: magic vars 10896 1726882162.00577: variable 'omit' from source: magic vars 10896 1726882162.00600: variable 'omit' from source: magic vars 10896 1726882162.00623: variable 'omit' from source: magic vars 10896 1726882162.00652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882162.00678: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882162.00691: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882162.00708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882162.00717: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882162.00740: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882162.00743: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882162.00746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882162.00815: Set connection var ansible_connection to ssh 10896 1726882162.00818: Set connection var ansible_timeout to 10 10896 1726882162.00822: Set connection var ansible_shell_type to sh 10896 1726882162.00829: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882162.00834: Set connection var ansible_shell_executable to /bin/sh 10896 1726882162.00839: Set connection var ansible_pipelining to False 10896 1726882162.00856: variable 'ansible_shell_executable' from source: unknown 10896 1726882162.00859: variable 'ansible_connection' from source: unknown 10896 1726882162.00862: variable 'ansible_module_compression' from source: unknown 10896 1726882162.00864: variable 'ansible_shell_type' from source: unknown 10896 1726882162.00866: variable 'ansible_shell_executable' from source: unknown 10896 1726882162.00868: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882162.00871: variable 'ansible_pipelining' from source: unknown 10896 1726882162.00873: variable 'ansible_timeout' from source: unknown 10896 1726882162.00879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882162.00972: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882162.00979: variable 'omit' from source: magic vars 10896 1726882162.00987: starting attempt loop 10896 1726882162.00990: running the handler 10896 1726882162.01001: handler run complete 10896 1726882162.01010: attempt loop complete, returning result 10896 1726882162.01012: _execute() done 10896 1726882162.01015: 
dumping result to json 10896 1726882162.01017: done dumping result, returning 10896 1726882162.01023: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [12673a56-9f93-8b02-b216-000000000007] 10896 1726882162.01027: sending task result for task 12673a56-9f93-8b02-b216-000000000007 10896 1726882162.01108: done sending task result for task 12673a56-9f93-8b02-b216-000000000007 10896 1726882162.01111: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 10896 1726882162.01156: no more pending results, returning what we have 10896 1726882162.01158: results queue empty 10896 1726882162.01159: checking for any_errors_fatal 10896 1726882162.01162: done checking for any_errors_fatal 10896 1726882162.01163: checking for max_fail_percentage 10896 1726882162.01164: done checking for max_fail_percentage 10896 1726882162.01165: checking to see if all hosts have failed and the running result is not ok 10896 1726882162.01165: done checking to see if all hosts have failed 10896 1726882162.01166: getting the remaining hosts for this loop 10896 1726882162.01167: done getting the remaining hosts for this loop 10896 1726882162.01170: getting the next task for host managed_node2 10896 1726882162.01174: done getting next task for host managed_node2 10896 1726882162.01176: ^ task is: TASK: meta (flush_handlers) 10896 1726882162.01177: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882162.01181: getting variables 10896 1726882162.01182: in VariableManager get_vars() 10896 1726882162.01207: Calling all_inventory to load vars for managed_node2 10896 1726882162.01210: Calling groups_inventory to load vars for managed_node2 10896 1726882162.01213: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882162.01222: Calling all_plugins_play to load vars for managed_node2 10896 1726882162.01224: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882162.01226: Calling groups_plugins_play to load vars for managed_node2 10896 1726882162.01323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882162.01552: done with get_vars() 10896 1726882162.01558: done getting variables 10896 1726882162.01598: in VariableManager get_vars() 10896 1726882162.01604: Calling all_inventory to load vars for managed_node2 10896 1726882162.01606: Calling groups_inventory to load vars for managed_node2 10896 1726882162.01607: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882162.01610: Calling all_plugins_play to load vars for managed_node2 10896 1726882162.01611: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882162.01613: Calling groups_plugins_play to load vars for managed_node2 10896 1726882162.01688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882162.01795: done with get_vars() 10896 1726882162.01804: done queuing things up, now waiting for results queue to drain 10896 1726882162.01805: results queue empty 10896 1726882162.01806: checking for any_errors_fatal 10896 1726882162.01807: done checking for any_errors_fatal 10896 1726882162.01808: checking for 
max_fail_percentage 10896 1726882162.01808: done checking for max_fail_percentage 10896 1726882162.01809: checking to see if all hosts have failed and the running result is not ok 10896 1726882162.01809: done checking to see if all hosts have failed 10896 1726882162.01809: getting the remaining hosts for this loop 10896 1726882162.01810: done getting the remaining hosts for this loop 10896 1726882162.01811: getting the next task for host managed_node2 10896 1726882162.01813: done getting next task for host managed_node2 10896 1726882162.01814: ^ task is: TASK: meta (flush_handlers) 10896 1726882162.01815: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882162.01820: getting variables 10896 1726882162.01821: in VariableManager get_vars() 10896 1726882162.01825: Calling all_inventory to load vars for managed_node2 10896 1726882162.01826: Calling groups_inventory to load vars for managed_node2 10896 1726882162.01828: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882162.01831: Calling all_plugins_play to load vars for managed_node2 10896 1726882162.01832: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882162.01833: Calling groups_plugins_play to load vars for managed_node2 10896 1726882162.01910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882162.02025: done with get_vars() 10896 1726882162.02029: done getting variables 10896 1726882162.02056: in VariableManager get_vars() 10896 1726882162.02061: Calling all_inventory to load vars for managed_node2 10896 1726882162.02063: Calling groups_inventory to load vars for managed_node2 10896 1726882162.02064: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882162.02067: Calling all_plugins_play to load vars for managed_node2 10896 1726882162.02068: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882162.02069: Calling groups_plugins_play to load vars for managed_node2 10896 1726882162.02146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882162.02256: done with get_vars() 10896 1726882162.02263: done queuing things up, now waiting for results queue to drain 10896 1726882162.02264: results queue empty 10896 1726882162.02265: checking for any_errors_fatal 10896 1726882162.02265: done checking for any_errors_fatal 10896 1726882162.02266: checking for max_fail_percentage 10896 1726882162.02266: done checking for max_fail_percentage 10896 1726882162.02267: checking to see if all hosts have failed and the running result is not ok 10896 1726882162.02267: done checking to see if all hosts have failed 10896 1726882162.02268: getting the remaining hosts for this loop 10896 1726882162.02268: done getting the remaining hosts for this loop 10896 1726882162.02269: getting the next task for host managed_node2 10896 1726882162.02271: done getting next task for host managed_node2 10896 1726882162.02272: ^ task is: None 10896 1726882162.02273: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 10896 1726882162.02273: done queuing things up, now waiting for results queue to drain 10896 1726882162.02274: results queue empty 10896 1726882162.02274: checking for any_errors_fatal 10896 1726882162.02275: done checking for any_errors_fatal 10896 1726882162.02275: checking for max_fail_percentage 10896 1726882162.02276: done checking for max_fail_percentage 10896 1726882162.02276: checking to see if all hosts have failed and the running result is not ok 10896 1726882162.02276: done checking to see if all hosts have failed 10896 1726882162.02277: getting the next task for host managed_node2 10896 1726882162.02279: done getting next task for host managed_node2 10896 1726882162.02279: ^ task is: None 10896 1726882162.02280: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882162.02316: in VariableManager get_vars() 10896 1726882162.02345: done with get_vars() 10896 1726882162.02351: in VariableManager get_vars() 10896 1726882162.02365: done with get_vars() 10896 1726882162.02368: variable 'omit' from source: magic vars 10896 1726882162.02387: in VariableManager get_vars() 10896 1726882162.02401: done with get_vars() 10896 1726882162.02415: variable 'omit' from source: magic vars PLAY [Play for testing bond device using deprecated 'master' argument] ********* 10896 1726882162.02805: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 10896 1726882162.02825: getting the remaining hosts for this loop 10896 1726882162.02826: done getting the remaining hosts for this loop 10896 1726882162.02828: getting the next task for host managed_node2 10896 1726882162.02830: done getting next task for host managed_node2 10896 1726882162.02831: ^ task is: TASK: Gathering Facts 10896 1726882162.02832: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882162.02833: getting variables 10896 1726882162.02833: in VariableManager get_vars() 10896 1726882162.02842: Calling all_inventory to load vars for managed_node2 10896 1726882162.02843: Calling groups_inventory to load vars for managed_node2 10896 1726882162.02844: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882162.02848: Calling all_plugins_play to load vars for managed_node2 10896 1726882162.02858: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882162.02860: Calling groups_plugins_play to load vars for managed_node2 10896 1726882162.02952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882162.03056: done with get_vars() 10896 1726882162.03062: done getting variables 10896 1726882162.03087: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:3 Friday 20 September 2024 21:29:22 -0400 (0:00:00.030) 0:00:03.598 ****** 10896 1726882162.03106: entering _queue_task() for managed_node2/gather_facts 10896 1726882162.03259: worker is 1 (out of 1 available) 10896 1726882162.03270: exiting _queue_task() for managed_node2/gather_facts 10896 1726882162.03280: done queuing things up, now waiting for results queue to drain 10896 1726882162.03281: waiting for pending results... 
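The "Gathering Facts" entry that follows is the fact-gathering step of the play declared at tests_bond_deprecated.yml:3. The log shows it resolving to the gather_facts action plugin, passing the ansible_distribution_major_version != '6' guard, and then shipping the setup module to the target as AnsiballZ_setup.py over the already established SSH ControlMaster session. Below is an explicit-task sketch that would exercise the same machinery; the real play most likely relies on implicit fact gathering from its play header.

---
# Sketch only: an explicit equivalent of the implicit fact-gathering step logged below.
- name: Gathering Facts
  ansible.builtin.setup:
  when: ansible_distribution_major_version != '6'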
10896 1726882162.03418: running TaskExecutor() for managed_node2/TASK: Gathering Facts 10896 1726882162.03464: in run() - task 12673a56-9f93-8b02-b216-000000000129 10896 1726882162.03474: variable 'ansible_search_path' from source: unknown 10896 1726882162.03503: calling self._execute() 10896 1726882162.03566: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882162.03570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882162.03577: variable 'omit' from source: magic vars 10896 1726882162.03971: variable 'ansible_distribution_major_version' from source: facts 10896 1726882162.03975: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882162.03977: variable 'omit' from source: magic vars 10896 1726882162.03980: variable 'omit' from source: magic vars 10896 1726882162.04199: variable 'omit' from source: magic vars 10896 1726882162.04202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882162.04206: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882162.04208: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882162.04210: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882162.04212: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882162.04214: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882162.04216: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882162.04219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882162.04277: Set connection var ansible_connection to ssh 10896 1726882162.04290: Set connection var ansible_timeout to 10 10896 1726882162.04300: Set connection var ansible_shell_type to sh 10896 1726882162.04313: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882162.04322: Set connection var ansible_shell_executable to /bin/sh 10896 1726882162.04331: Set connection var ansible_pipelining to False 10896 1726882162.04355: variable 'ansible_shell_executable' from source: unknown 10896 1726882162.04362: variable 'ansible_connection' from source: unknown 10896 1726882162.04369: variable 'ansible_module_compression' from source: unknown 10896 1726882162.04376: variable 'ansible_shell_type' from source: unknown 10896 1726882162.04382: variable 'ansible_shell_executable' from source: unknown 10896 1726882162.04389: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882162.04398: variable 'ansible_pipelining' from source: unknown 10896 1726882162.04405: variable 'ansible_timeout' from source: unknown 10896 1726882162.04413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882162.04578: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882162.04596: variable 'omit' from source: magic vars 10896 1726882162.04607: starting attempt loop 10896 1726882162.04631: running the 
handler 10896 1726882162.04650: variable 'ansible_facts' from source: unknown 10896 1726882162.04671: _low_level_execute_command(): starting 10896 1726882162.04684: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882162.05205: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882162.05223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 10896 1726882162.05236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882162.05273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882162.05301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882162.05370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 10896 1726882162.07641: stdout chunk (state=3): >>>/root <<< 10896 1726882162.07844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882162.07847: stdout chunk (state=3): >>><<< 10896 1726882162.07849: stderr chunk (state=3): >>><<< 10896 1726882162.07961: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 10896 1726882162.07964: _low_level_execute_command(): starting 10896 1726882162.07968: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882162.0787008-11135-4281751684588 `" && echo 
ansible-tmp-1726882162.0787008-11135-4281751684588="` echo /root/.ansible/tmp/ansible-tmp-1726882162.0787008-11135-4281751684588 `" ) && sleep 0' 10896 1726882162.08418: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882162.08423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882162.08427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882162.08430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882162.08438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882162.08482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882162.08485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882162.08559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 10896 1726882162.11215: stdout chunk (state=3): >>>ansible-tmp-1726882162.0787008-11135-4281751684588=/root/.ansible/tmp/ansible-tmp-1726882162.0787008-11135-4281751684588 <<< 10896 1726882162.11353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882162.11376: stderr chunk (state=3): >>><<< 10896 1726882162.11379: stdout chunk (state=3): >>><<< 10896 1726882162.11395: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882162.0787008-11135-4281751684588=/root/.ansible/tmp/ansible-tmp-1726882162.0787008-11135-4281751684588 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 10896 1726882162.11426: variable 'ansible_module_compression' from source: unknown 10896 
1726882162.11462: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 10896 1726882162.11518: variable 'ansible_facts' from source: unknown 10896 1726882162.11646: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882162.0787008-11135-4281751684588/AnsiballZ_setup.py 10896 1726882162.11892: Sending initial data 10896 1726882162.11899: Sent initial data (152 bytes) 10896 1726882162.12505: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882162.12508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882162.12551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882162.12569: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882162.12584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882162.12685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 10896 1726882162.14897: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882162.14994: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10896 1726882162.15058: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmphkm_yv8x /root/.ansible/tmp/ansible-tmp-1726882162.0787008-11135-4281751684588/AnsiballZ_setup.py <<< 10896 1726882162.15069: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882162.0787008-11135-4281751684588/AnsiballZ_setup.py" <<< 10896 1726882162.15121: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmphkm_yv8x" to remote "/root/.ansible/tmp/ansible-tmp-1726882162.0787008-11135-4281751684588/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882162.0787008-11135-4281751684588/AnsiballZ_setup.py" <<< 10896 1726882162.17162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882162.17165: stdout chunk (state=3): >>><<< 10896 1726882162.17168: stderr chunk (state=3): >>><<< 10896 1726882162.17170: done transferring module to remote 10896 1726882162.17172: _low_level_execute_command(): starting 10896 1726882162.17174: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882162.0787008-11135-4281751684588/ /root/.ansible/tmp/ansible-tmp-1726882162.0787008-11135-4281751684588/AnsiballZ_setup.py && sleep 0' 10896 1726882162.17827: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 10896 1726882162.17843: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10896 1726882162.17911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882162.17956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882162.17975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882162.18003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882162.18110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 10896 1726882162.20651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882162.20655: stdout chunk (state=3): >>><<< 10896 1726882162.20657: stderr chunk (state=3): >>><<< 10896 1726882162.20701: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 10896 1726882162.20710: _low_level_execute_command(): starting 10896 1726882162.20713: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882162.0787008-11135-4281751684588/AnsiballZ_setup.py && sleep 0' 10896 1726882162.21255: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882162.21270: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882162.21312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882162.21352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882162.21366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882162.21423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882162.21461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882162.21556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 10896 1726882163.00829: stdout chunk (state=3): >>> <<< 10896 1726882163.01002: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_loadavg": {"1m": 0.42138671875, "5m": 0.23095703125, "15m": 0.109375}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "29", "second": "22", "epoch": "1726882162", "epoch_int": "1726882162", "date": "2024-09-20", "time": "21:29:22", "iso8601_micro": "2024-09-21T01:29:22.635281Z", "iso8601": "2024-09-21T01:29:22Z", "iso8601_basic": "20240920T212922635281", "iso8601_basic_short": "20240920T212922", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2969, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 562, "free": 2969}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "<<< 10896 1726882163.01021: stdout chunk (state=3): >>>cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", 
"ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 352, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261793452032, "block_size": 4096, "block_total": 65519099, "block_available": 63914417, "block_used": 1604682, "inode_total": 131070960, "inode_available": 131029070, "inode_used": 41890, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", 
"tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", 
"tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 10896 1726882163.03902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 10896 1726882163.03905: stdout chunk (state=3): >>><<< 10896 1726882163.03907: stderr chunk (state=3): >>><<< 10896 1726882163.03912: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_loadavg": {"1m": 0.42138671875, "5m": 0.23095703125, "15m": 0.109375}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "29", "second": "22", "epoch": "1726882162", "epoch_int": "1726882162", "date": "2024-09-20", "time": "21:29:22", "iso8601_micro": "2024-09-21T01:29:22.635281Z", "iso8601": 
"2024-09-21T01:29:22Z", "iso8601_basic": "20240920T212922635281", "iso8601_basic_short": "20240920T212922", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2969, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 562, "free": 2969}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 352, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261793452032, "block_size": 4096, "block_total": 65519099, "block_available": 63914417, "block_used": 1604682, "inode_total": 131070960, "inode_available": 131029070, "inode_used": 41890, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": 
"off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", 
"tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, 
"invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 10896 1726882163.04496: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882162.0787008-11135-4281751684588/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882163.04549: _low_level_execute_command(): starting 10896 1726882163.04606: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882162.0787008-11135-4281751684588/ > /dev/null 2>&1 && sleep 0' 10896 1726882163.05556: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882163.05568: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882163.05582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882163.05603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882163.05628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882163.05711: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882163.05741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882163.05763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882163.05777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882163.05891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 10896 1726882163.08428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882163.08527: stderr chunk (state=3): >>><<< 10896 1726882163.08530: stdout chunk (state=3): >>><<< 10896 1726882163.08853: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 10896 1726882163.08857: handler run complete 10896 1726882163.09180: variable 'ansible_facts' from source: unknown 10896 1726882163.09422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882163.10343: variable 'ansible_facts' from source: unknown 10896 1726882163.10653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882163.10835: attempt loop complete, returning result 10896 1726882163.11074: _execute() done 10896 1726882163.11081: dumping result to json 10896 1726882163.11118: done dumping result, returning 10896 1726882163.11184: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [12673a56-9f93-8b02-b216-000000000129] 10896 1726882163.11197: sending task result for task 12673a56-9f93-8b02-b216-000000000129 ok: [managed_node2] 10896 1726882163.13143: no more pending results, returning what we have 10896 1726882163.13146: results queue empty 10896 1726882163.13147: checking for any_errors_fatal 10896 1726882163.13148: done checking for any_errors_fatal 10896 1726882163.13149: checking for max_fail_percentage 10896 1726882163.13151: done checking for max_fail_percentage 10896 1726882163.13151: checking to see if all hosts have failed and the running result is not ok 10896 1726882163.13152: done checking to see if all hosts have failed 10896 1726882163.13153: getting the remaining hosts for this loop 10896 1726882163.13154: done getting the remaining hosts for this loop 10896 1726882163.13158: getting the next task for host 
managed_node2 10896 1726882163.13163: done getting next task for host managed_node2 10896 1726882163.13165: ^ task is: TASK: meta (flush_handlers) 10896 1726882163.13167: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882163.13170: getting variables 10896 1726882163.13171: in VariableManager get_vars() 10896 1726882163.13300: done sending task result for task 12673a56-9f93-8b02-b216-000000000129 10896 1726882163.13303: WORKER PROCESS EXITING 10896 1726882163.13309: Calling all_inventory to load vars for managed_node2 10896 1726882163.13311: Calling groups_inventory to load vars for managed_node2 10896 1726882163.13314: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882163.13323: Calling all_plugins_play to load vars for managed_node2 10896 1726882163.13326: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882163.13329: Calling groups_plugins_play to load vars for managed_node2 10896 1726882163.13881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882163.14474: done with get_vars() 10896 1726882163.14484: done getting variables 10896 1726882163.14552: in VariableManager get_vars() 10896 1726882163.14566: Calling all_inventory to load vars for managed_node2 10896 1726882163.14569: Calling groups_inventory to load vars for managed_node2 10896 1726882163.14571: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882163.14575: Calling all_plugins_play to load vars for managed_node2 10896 1726882163.14578: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882163.14580: Calling groups_plugins_play to load vars for managed_node2 10896 1726882163.15132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882163.15715: done with get_vars() 10896 1726882163.15728: done queuing things up, now waiting for results queue to drain 10896 1726882163.15730: results queue empty 10896 1726882163.15731: checking for any_errors_fatal 10896 1726882163.15734: done checking for any_errors_fatal 10896 1726882163.15740: checking for max_fail_percentage 10896 1726882163.15741: done checking for max_fail_percentage 10896 1726882163.15742: checking to see if all hosts have failed and the running result is not ok 10896 1726882163.15742: done checking to see if all hosts have failed 10896 1726882163.15743: getting the remaining hosts for this loop 10896 1726882163.15744: done getting the remaining hosts for this loop 10896 1726882163.15746: getting the next task for host managed_node2 10896 1726882163.15750: done getting next task for host managed_node2 10896 1726882163.15752: ^ task is: TASK: INIT Prepare setup 10896 1726882163.15753: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882163.15755: getting variables 10896 1726882163.15756: in VariableManager get_vars() 10896 1726882163.15769: Calling all_inventory to load vars for managed_node2 10896 1726882163.15771: Calling groups_inventory to load vars for managed_node2 10896 1726882163.15773: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882163.15777: Calling all_plugins_play to load vars for managed_node2 10896 1726882163.15779: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882163.15781: Calling groups_plugins_play to load vars for managed_node2 10896 1726882163.16315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882163.16807: done with get_vars() 10896 1726882163.16815: done getting variables 10896 1726882163.16889: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:15 Friday 20 September 2024 21:29:23 -0400 (0:00:01.139) 0:00:04.737 ****** 10896 1726882163.17026: entering _queue_task() for managed_node2/debug 10896 1726882163.17028: Creating lock for debug 10896 1726882163.17769: worker is 1 (out of 1 available) 10896 1726882163.17779: exiting _queue_task() for managed_node2/debug 10896 1726882163.17789: done queuing things up, now waiting for results queue to drain 10896 1726882163.17790: waiting for pending results... 
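The trace above shows the pattern the controller repeats for every module run: AnsiballZ_setup.py prints a single JSON document on stdout, _low_level_execute_command() hands it back as rc/stdout/stderr, and the handler stores it as ansible_facts, which later tasks consult for conditionals such as ansible_distribution_major_version != '6'. As a minimal, hypothetical sketch (not Ansible's own code), a payload of the same shape as the stdout above could be parsed and inspected like this, assuming that JSON text has been saved to a file named setup_output.json:

    # Minimal sketch, not part of the playbook run: parse a setup-module result
    # of the shape shown in the stdout above and read a few gathered facts.
    # "setup_output.json" is a hypothetical file holding that JSON text.
    import json

    with open("setup_output.json") as f:
        result = json.load(f)

    facts = result["ansible_facts"]
    print(facts["ansible_distribution"])                    # "CentOS" in the trace above
    print(facts["ansible_distribution_major_version"])      # "10", so the != '6' conditional is True
    print(facts["ansible_default_ipv4"]["address"])         # "10.31.14.69"
    print(facts["ansible_pkg_mgr"])                         # "dnf", which is why AnsiballZ_dnf.py is built later
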
10896 1726882163.18240: running TaskExecutor() for managed_node2/TASK: INIT Prepare setup 10896 1726882163.18536: in run() - task 12673a56-9f93-8b02-b216-00000000000b 10896 1726882163.18554: variable 'ansible_search_path' from source: unknown 10896 1726882163.18755: calling self._execute() 10896 1726882163.18889: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882163.19362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882163.19366: variable 'omit' from source: magic vars 10896 1726882163.19884: variable 'ansible_distribution_major_version' from source: facts 10896 1726882163.19931: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882163.19943: variable 'omit' from source: magic vars 10896 1726882163.20046: variable 'omit' from source: magic vars 10896 1726882163.20084: variable 'omit' from source: magic vars 10896 1726882163.20126: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882163.20232: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882163.20261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882163.20373: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882163.20391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882163.20428: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882163.20438: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882163.20446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882163.20662: Set connection var ansible_connection to ssh 10896 1726882163.20677: Set connection var ansible_timeout to 10 10896 1726882163.20688: Set connection var ansible_shell_type to sh 10896 1726882163.21001: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882163.21004: Set connection var ansible_shell_executable to /bin/sh 10896 1726882163.21007: Set connection var ansible_pipelining to False 10896 1726882163.21009: variable 'ansible_shell_executable' from source: unknown 10896 1726882163.21012: variable 'ansible_connection' from source: unknown 10896 1726882163.21014: variable 'ansible_module_compression' from source: unknown 10896 1726882163.21016: variable 'ansible_shell_type' from source: unknown 10896 1726882163.21018: variable 'ansible_shell_executable' from source: unknown 10896 1726882163.21020: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882163.21022: variable 'ansible_pipelining' from source: unknown 10896 1726882163.21023: variable 'ansible_timeout' from source: unknown 10896 1726882163.21025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882163.21137: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882163.21300: variable 'omit' from source: magic vars 10896 1726882163.21304: starting attempt loop 10896 1726882163.21306: running the handler 10896 
1726882163.21308: handler run complete 10896 1726882163.21404: attempt loop complete, returning result 10896 1726882163.21412: _execute() done 10896 1726882163.21419: dumping result to json 10896 1726882163.21426: done dumping result, returning 10896 1726882163.21899: done running TaskExecutor() for managed_node2/TASK: INIT Prepare setup [12673a56-9f93-8b02-b216-00000000000b] 10896 1726882163.21902: sending task result for task 12673a56-9f93-8b02-b216-00000000000b 10896 1726882163.21965: done sending task result for task 12673a56-9f93-8b02-b216-00000000000b 10896 1726882163.21968: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ################################################## 10896 1726882163.22013: no more pending results, returning what we have 10896 1726882163.22016: results queue empty 10896 1726882163.22017: checking for any_errors_fatal 10896 1726882163.22018: done checking for any_errors_fatal 10896 1726882163.22019: checking for max_fail_percentage 10896 1726882163.22021: done checking for max_fail_percentage 10896 1726882163.22022: checking to see if all hosts have failed and the running result is not ok 10896 1726882163.22023: done checking to see if all hosts have failed 10896 1726882163.22024: getting the remaining hosts for this loop 10896 1726882163.22025: done getting the remaining hosts for this loop 10896 1726882163.22028: getting the next task for host managed_node2 10896 1726882163.22041: done getting next task for host managed_node2 10896 1726882163.22045: ^ task is: TASK: Install dnsmasq 10896 1726882163.22047: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882163.22051: getting variables 10896 1726882163.22052: in VariableManager get_vars() 10896 1726882163.22087: Calling all_inventory to load vars for managed_node2 10896 1726882163.22090: Calling groups_inventory to load vars for managed_node2 10896 1726882163.22097: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882163.22106: Calling all_plugins_play to load vars for managed_node2 10896 1726882163.22109: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882163.22112: Calling groups_plugins_play to load vars for managed_node2 10896 1726882163.22558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882163.22989: done with get_vars() 10896 1726882163.23002: done getting variables 10896 1726882163.23148: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:29:23 -0400 (0:00:00.061) 0:00:04.798 ****** 10896 1726882163.23177: entering _queue_task() for managed_node2/package 10896 1726882163.23711: worker is 1 (out of 1 available) 10896 1726882163.23721: exiting _queue_task() for managed_node2/package 10896 1726882163.23732: done queuing things up, now waiting for results queue to drain 10896 1726882163.23733: waiting for pending results... 
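Queuing the package task kicks off the same low-level execution sequence already visible for Gathering Facts above and spelled out again below: resolve the remote home directory (echo ~), create a private per-task temp dir under ~/.ansible/tmp with umask 77, sftp the AnsiballZ payload into it, chmod u+x, run it with /usr/bin/python3.12, and finally rm -f -r the temp dir. A rough, hypothetical sketch of that command sequence over plain ssh/sftp (not Ansible's actual connection plugin, which multiplexes over the ControlPersist master shown in the debug output) could look like this:

    # Rough sketch of the command sequence in this trace, for illustration only.
    # HOST and LOCAL_MODULE are placeholders; Ansible's real logic lives in its
    # ssh connection plugin and action plugins. The temp-dir naming is simplified.
    import shlex
    import subprocess

    HOST = "root@10.31.14.69"          # target from the trace; assumes working key-based SSH
    LOCAL_MODULE = "AnsiballZ_dnf.py"  # hypothetical local copy of the built module payload

    def remote_sh(cmd: str) -> str:
        """Run a command remotely the way the trace does: /bin/sh -c '<cmd>'."""
        out = subprocess.run(
            ["ssh", HOST, "/bin/sh -c " + shlex.quote(cmd)],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.strip()

    home = remote_sh("echo ~ && sleep 0")                   # 1. resolve remote home (-> /root)
    tmp = remote_sh(                                        # 2. private per-task temp dir
        '( umask 77 && mkdir -p "$HOME/.ansible/tmp/demo-task" '
        '&& echo "$HOME/.ansible/tmp/demo-task" ) && sleep 0'
    )
    subprocess.run(["sftp", "-b", "-", HOST],               # 3. upload the module (sftp, as in the trace)
                   input=f"put {LOCAL_MODULE} {tmp}/AnsiballZ_dnf.py\n",
                   text=True, check=True)
    remote_sh(f"chmod u+x {tmp}/ {tmp}/AnsiballZ_dnf.py && sleep 0")           # 4. make it executable
    print(remote_sh(f"/usr/bin/python3.12 {tmp}/AnsiballZ_dnf.py && sleep 0")) # 5. run it; prints one JSON result
    remote_sh(f"rm -f -r {tmp}/ > /dev/null 2>&1 && sleep 0")                  # 6. clean up the temp dir
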
10896 1726882163.24089: running TaskExecutor() for managed_node2/TASK: Install dnsmasq 10896 1726882163.24400: in run() - task 12673a56-9f93-8b02-b216-00000000000f 10896 1726882163.24405: variable 'ansible_search_path' from source: unknown 10896 1726882163.24410: variable 'ansible_search_path' from source: unknown 10896 1726882163.24565: calling self._execute() 10896 1726882163.24645: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882163.24680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882163.24698: variable 'omit' from source: magic vars 10896 1726882163.25531: variable 'ansible_distribution_major_version' from source: facts 10896 1726882163.25534: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882163.25537: variable 'omit' from source: magic vars 10896 1726882163.25614: variable 'omit' from source: magic vars 10896 1726882163.26041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882163.30337: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882163.30484: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882163.30543: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882163.30647: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882163.30800: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882163.31006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882163.31118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882163.31134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882163.31265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882163.31289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882163.31537: variable '__network_is_ostree' from source: set_fact 10896 1726882163.31637: variable 'omit' from source: magic vars 10896 1726882163.31641: variable 'omit' from source: magic vars 10896 1726882163.31856: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882163.31860: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882163.31862: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882163.31865: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 10896 1726882163.31867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882163.32004: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882163.32014: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882163.32024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882163.32297: Set connection var ansible_connection to ssh 10896 1726882163.32300: Set connection var ansible_timeout to 10 10896 1726882163.32303: Set connection var ansible_shell_type to sh 10896 1726882163.32305: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882163.32307: Set connection var ansible_shell_executable to /bin/sh 10896 1726882163.32309: Set connection var ansible_pipelining to False 10896 1726882163.32423: variable 'ansible_shell_executable' from source: unknown 10896 1726882163.32436: variable 'ansible_connection' from source: unknown 10896 1726882163.32443: variable 'ansible_module_compression' from source: unknown 10896 1726882163.32449: variable 'ansible_shell_type' from source: unknown 10896 1726882163.32455: variable 'ansible_shell_executable' from source: unknown 10896 1726882163.32461: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882163.32467: variable 'ansible_pipelining' from source: unknown 10896 1726882163.32473: variable 'ansible_timeout' from source: unknown 10896 1726882163.32482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882163.32759: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882163.32762: variable 'omit' from source: magic vars 10896 1726882163.32764: starting attempt loop 10896 1726882163.32769: running the handler 10896 1726882163.32780: variable 'ansible_facts' from source: unknown 10896 1726882163.32948: variable 'ansible_facts' from source: unknown 10896 1726882163.32951: _low_level_execute_command(): starting 10896 1726882163.32954: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882163.34514: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882163.34702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 
1726882163.34726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882163.34920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882163.35160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 10896 1726882163.37549: stdout chunk (state=3): >>>/root <<< 10896 1726882163.37613: stdout chunk (state=3): >>><<< 10896 1726882163.37706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882163.37718: stderr chunk (state=3): >>><<< 10896 1726882163.37752: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 10896 1726882163.37780: _low_level_execute_command(): starting 10896 1726882163.37797: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882163.3776805-11192-181000927356039 `" && echo ansible-tmp-1726882163.3776805-11192-181000927356039="` echo /root/.ansible/tmp/ansible-tmp-1726882163.3776805-11192-181000927356039 `" ) && sleep 0' 10896 1726882163.39111: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882163.39290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882163.39356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882163.39359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882163.39407: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882163.39457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 10896 1726882163.42230: stdout chunk (state=3): >>>ansible-tmp-1726882163.3776805-11192-181000927356039=/root/.ansible/tmp/ansible-tmp-1726882163.3776805-11192-181000927356039 <<< 10896 1726882163.42265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882163.42424: stderr chunk (state=3): >>><<< 10896 1726882163.42447: stdout chunk (state=3): >>><<< 10896 1726882163.42473: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882163.3776805-11192-181000927356039=/root/.ansible/tmp/ansible-tmp-1726882163.3776805-11192-181000927356039 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 10896 1726882163.42533: variable 'ansible_module_compression' from source: unknown 10896 1726882163.42773: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 10896 1726882163.42777: ANSIBALLZ: Acquiring lock 10896 1726882163.42779: ANSIBALLZ: Lock acquired: 139646160836496 10896 1726882163.42781: ANSIBALLZ: Creating module 10896 1726882163.85973: ANSIBALLZ: Writing module into payload 10896 1726882163.86350: ANSIBALLZ: Writing module 10896 1726882163.86377: ANSIBALLZ: Renaming module 10896 1726882163.86398: ANSIBALLZ: Done creating module 10896 1726882163.86430: variable 'ansible_facts' from source: unknown 10896 1726882163.86546: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882163.3776805-11192-181000927356039/AnsiballZ_dnf.py 10896 1726882163.86775: Sending initial data 10896 1726882163.86778: Sent initial data (152 bytes) 10896 1726882163.87712: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882163.87737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882163.87824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882163.87985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882163.88004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882163.88097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882163.90214: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882163.90346: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882163.90614: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpn1pci8_g /root/.ansible/tmp/ansible-tmp-1726882163.3776805-11192-181000927356039/AnsiballZ_dnf.py <<< 10896 1726882163.90630: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882163.3776805-11192-181000927356039/AnsiballZ_dnf.py" <<< 10896 1726882163.90686: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpn1pci8_g" to remote "/root/.ansible/tmp/ansible-tmp-1726882163.3776805-11192-181000927356039/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882163.3776805-11192-181000927356039/AnsiballZ_dnf.py" <<< 10896 1726882163.91784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882163.91931: stderr chunk (state=3): >>><<< 10896 1726882163.91935: stdout chunk (state=3): >>><<< 10896 1726882163.91937: done transferring module to remote 10896 1726882163.91940: _low_level_execute_command(): starting 10896 1726882163.91942: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882163.3776805-11192-181000927356039/ /root/.ansible/tmp/ansible-tmp-1726882163.3776805-11192-181000927356039/AnsiballZ_dnf.py && sleep 0' 10896 1726882163.92485: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882163.92504: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882163.92518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882163.92545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882163.92609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 
originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882163.92661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882163.92675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882163.92720: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882163.92874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882163.95320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882163.95324: stdout chunk (state=3): >>><<< 10896 1726882163.95327: stderr chunk (state=3): >>><<< 10896 1726882163.95516: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882163.95527: _low_level_execute_command(): starting 10896 1726882163.95530: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882163.3776805-11192-181000927356039/AnsiballZ_dnf.py && sleep 0' 10896 1726882163.96673: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882163.96677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882163.96680: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882163.96682: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882163.96684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882163.96949: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882163.97418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882165.41797: stdout chunk (state=3): >>> {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-3.el10.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 10896 1726882165.47490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882165.47535: stderr chunk (state=3): >>><<< 10896 1726882165.47538: stdout chunk (state=3): >>><<< 10896 1726882165.47560: _low_level_execute_command() done: rc=0, stdout= {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-3.el10.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 10896 1726882165.47591: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882163.3776805-11192-181000927356039/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882165.47604: _low_level_execute_command(): starting 10896 1726882165.47611: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882163.3776805-11192-181000927356039/ > /dev/null 2>&1 && sleep 0' 10896 1726882165.48100: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882165.48104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882165.48106: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 10896 1726882165.48108: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882165.48110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882165.48158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882165.48163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882165.48166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882165.48250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882165.50496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882165.50524: stderr chunk (state=3): >>><<< 10896 1726882165.50527: stdout chunk (state=3): >>><<< 10896 1726882165.50540: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882165.50551: handler run complete 10896 1726882165.50702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882165.50863: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882165.50889: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882165.50935: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882165.50962: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882165.51014: variable '__install_status' from source: unknown 10896 1726882165.51040: Evaluated conditional (__install_status is success): True 10896 1726882165.51054: attempt loop complete, returning result 10896 1726882165.51061: _execute() done 10896 1726882165.51065: dumping result to json 10896 1726882165.51092: done dumping result, returning 10896 1726882165.51097: done running TaskExecutor() for managed_node2/TASK: Install dnsmasq [12673a56-9f93-8b02-b216-00000000000f] 10896 1726882165.51099: sending task result for task 12673a56-9f93-8b02-b216-00000000000f 10896 1726882165.51208: done sending task result for task 12673a56-9f93-8b02-b216-00000000000f 10896 1726882165.51212: WORKER PROCESS EXITING changed: [managed_node2] => { "attempts": 1, "changed": true, "rc": 0, "results": [ "Installed: dnsmasq-2.90-3.el10.x86_64" ] } 10896 1726882165.51309: no more pending results, returning what we have 10896 1726882165.51312: results queue empty 10896 1726882165.51313: checking for any_errors_fatal 10896 1726882165.51318: done checking for any_errors_fatal 10896 1726882165.51319: checking for max_fail_percentage 10896 1726882165.51320: done checking for max_fail_percentage 10896 1726882165.51321: checking to see if all hosts have failed and the running result is not ok 10896 1726882165.51322: done checking to see if all hosts have failed 10896 1726882165.51323: getting the remaining hosts for this loop 10896 1726882165.51324: done getting the remaining hosts for this loop 10896 1726882165.51327: getting the next task for host managed_node2 10896 1726882165.51332: done getting next task for host managed_node2 10896 1726882165.51335: ^ task is: TASK: Install pgrep, sysctl 10896 1726882165.51338: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882165.51340: getting variables 10896 1726882165.51341: in VariableManager get_vars() 10896 1726882165.51382: Calling all_inventory to load vars for managed_node2 10896 1726882165.51385: Calling groups_inventory to load vars for managed_node2 10896 1726882165.51387: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882165.51431: Calling all_plugins_play to load vars for managed_node2 10896 1726882165.51435: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882165.51438: Calling groups_plugins_play to load vars for managed_node2 10896 1726882165.51605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882165.51759: done with get_vars() 10896 1726882165.51767: done getting variables 10896 1726882165.51811: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 21:29:25 -0400 (0:00:02.286) 0:00:07.085 ****** 10896 1726882165.51833: entering _queue_task() for managed_node2/package 10896 1726882165.52094: worker is 1 (out of 1 available) 10896 1726882165.52107: exiting _queue_task() for managed_node2/package 10896 1726882165.52119: done queuing things up, now waiting for results queue to drain 10896 1726882165.52120: waiting for pending results... 
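The entries above cover the full life cycle of the "Install dnsmasq" task on managed_node2: a timestamped temporary directory is created under /root/.ansible/tmp, the AnsiballZ_dnf.py payload is copied over the multiplexed SSH connection with sftp, made executable, run with the remote /usr/bin/python3.12, and the temporary directory is removed. The module returns changed=true with "Installed: dnsmasq-2.90-3.el10.x86_64", and the attempt loop ends after one try because the conditional "__install_status is success" evaluates to True. The task definition itself is not reproduced in the log; a minimal sketch of a task that would leave this trace (the register name comes from the conditional in the log, while the retries/delay values and the use of the package action rather than dnf directly are assumptions) is:

    - name: Install dnsmasq
      ansible.builtin.package:
        name: dnsmasq
        state: present
      register: __install_status        # the log evaluates "__install_status is success"
      until: __install_status is success
      retries: 3                        # assumed; only "attempts": 1 is visible in the result
      delay: 2                          # assumed

Whatever the exact action keyword, the module that ultimately executes on the host is ansible.legacy.dnf, as the _execute_module entry above shows.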
10896 1726882165.52337: running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl 10896 1726882165.52405: in run() - task 12673a56-9f93-8b02-b216-000000000010 10896 1726882165.52418: variable 'ansible_search_path' from source: unknown 10896 1726882165.52422: variable 'ansible_search_path' from source: unknown 10896 1726882165.52452: calling self._execute() 10896 1726882165.52512: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882165.52516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882165.52524: variable 'omit' from source: magic vars 10896 1726882165.52785: variable 'ansible_distribution_major_version' from source: facts 10896 1726882165.52795: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882165.52874: variable 'ansible_os_family' from source: facts 10896 1726882165.52878: Evaluated conditional (ansible_os_family == 'RedHat'): True 10896 1726882165.52997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882165.53230: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882165.53260: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882165.53284: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882165.53315: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882165.53366: variable 'ansible_distribution_major_version' from source: facts 10896 1726882165.53375: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 10896 1726882165.53378: when evaluation is False, skipping this task 10896 1726882165.53380: _execute() done 10896 1726882165.53383: dumping result to json 10896 1726882165.53385: done dumping result, returning 10896 1726882165.53391: done running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl [12673a56-9f93-8b02-b216-000000000010] 10896 1726882165.53402: sending task result for task 12673a56-9f93-8b02-b216-000000000010 10896 1726882165.53481: done sending task result for task 12673a56-9f93-8b02-b216-000000000010 10896 1726882165.53484: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 10896 1726882165.53529: no more pending results, returning what we have 10896 1726882165.53532: results queue empty 10896 1726882165.53533: checking for any_errors_fatal 10896 1726882165.53539: done checking for any_errors_fatal 10896 1726882165.53539: checking for max_fail_percentage 10896 1726882165.53540: done checking for max_fail_percentage 10896 1726882165.53541: checking to see if all hosts have failed and the running result is not ok 10896 1726882165.53542: done checking to see if all hosts have failed 10896 1726882165.53543: getting the remaining hosts for this loop 10896 1726882165.53544: done getting the remaining hosts for this loop 10896 1726882165.53547: getting the next task for host managed_node2 10896 1726882165.53552: done getting next task for host managed_node2 10896 1726882165.53554: ^ task is: TASK: Install pgrep, sysctl 10896 1726882165.53557: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882165.53559: getting variables 10896 1726882165.53560: in VariableManager get_vars() 10896 1726882165.53590: Calling all_inventory to load vars for managed_node2 10896 1726882165.53594: Calling groups_inventory to load vars for managed_node2 10896 1726882165.53597: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882165.53605: Calling all_plugins_play to load vars for managed_node2 10896 1726882165.53608: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882165.53610: Calling groups_plugins_play to load vars for managed_node2 10896 1726882165.53749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882165.53864: done with get_vars() 10896 1726882165.53871: done getting variables 10896 1726882165.53910: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 21:29:25 -0400 (0:00:00.020) 0:00:07.106 ****** 10896 1726882165.53929: entering _queue_task() for managed_node2/package 10896 1726882165.54107: worker is 1 (out of 1 available) 10896 1726882165.54120: exiting _queue_task() for managed_node2/package 10896 1726882165.54132: done queuing things up, now waiting for results queue to drain 10896 1726882165.54133: waiting for pending results... 
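The "Install pgrep, sysctl" task at line 17 of create_test_interfaces_with_dhcp.yml is skipped: its guard "ansible_distribution_major_version is version('6', '<=')" is False on this EL10 host, which is exactly the false_condition reported in the skip result above. The task queued next, at line 26 of the same file, carries the complementary guard "ansible_distribution_major_version is version('7', '>=')" and does run (it installs procps-ng, as the following entries show). The task bodies are not printed in the log; a hedged reconstruction of the pair, with the skipped task's package name assumed and without knowing which conditions are written on the tasks themselves versus inherited from an enclosing block, is:

    # create_test_interfaces_with_dhcp.yml:17 - skipped on this host
    - name: Install pgrep, sysctl
      ansible.builtin.package:
        name: procps              # assumed EL6 package name; not visible because the task is skipped
        state: present
      when:
        - ansible_os_family == 'RedHat'
        - ansible_distribution_major_version is version('6', '<=')

    # create_test_interfaces_with_dhcp.yml:26 - runs and installs procps-ng
    - name: Install pgrep, sysctl
      ansible.builtin.package:
        name: procps-ng
        state: present
      when:
        - ansible_os_family == 'RedHat'
        - ansible_distribution_major_version is version('7', '>=')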
10896 1726882165.54289: running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl 10896 1726882165.54358: in run() - task 12673a56-9f93-8b02-b216-000000000011 10896 1726882165.54367: variable 'ansible_search_path' from source: unknown 10896 1726882165.54371: variable 'ansible_search_path' from source: unknown 10896 1726882165.54399: calling self._execute() 10896 1726882165.54461: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882165.54465: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882165.54473: variable 'omit' from source: magic vars 10896 1726882165.54938: variable 'ansible_distribution_major_version' from source: facts 10896 1726882165.54943: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882165.55043: variable 'ansible_os_family' from source: facts 10896 1726882165.55058: Evaluated conditional (ansible_os_family == 'RedHat'): True 10896 1726882165.55339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882165.55553: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882165.55618: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882165.55637: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882165.55663: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882165.55718: variable 'ansible_distribution_major_version' from source: facts 10896 1726882165.55756: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 10896 1726882165.55759: variable 'omit' from source: magic vars 10896 1726882165.55776: variable 'omit' from source: magic vars 10896 1726882165.55875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882165.58001: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882165.58010: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882165.58026: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882165.58067: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882165.58109: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882165.58214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882165.58237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882165.58254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882165.58323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882165.58367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882165.58522: variable '__network_is_ostree' from source: set_fact 10896 1726882165.58535: variable 'omit' from source: magic vars 10896 1726882165.58663: variable 'omit' from source: magic vars 10896 1726882165.58666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882165.58669: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882165.58671: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882165.58673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882165.58680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882165.58705: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882165.58708: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882165.58710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882165.58791: Set connection var ansible_connection to ssh 10896 1726882165.58799: Set connection var ansible_timeout to 10 10896 1726882165.58802: Set connection var ansible_shell_type to sh 10896 1726882165.58808: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882165.58813: Set connection var ansible_shell_executable to /bin/sh 10896 1726882165.58817: Set connection var ansible_pipelining to False 10896 1726882165.58839: variable 'ansible_shell_executable' from source: unknown 10896 1726882165.58842: variable 'ansible_connection' from source: unknown 10896 1726882165.58852: variable 'ansible_module_compression' from source: unknown 10896 1726882165.58855: variable 'ansible_shell_type' from source: unknown 10896 1726882165.58857: variable 'ansible_shell_executable' from source: unknown 10896 1726882165.58859: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882165.58861: variable 'ansible_pipelining' from source: unknown 10896 1726882165.58865: variable 'ansible_timeout' from source: unknown 10896 1726882165.58867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882165.58931: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882165.58939: variable 'omit' from source: magic vars 10896 1726882165.58947: starting attempt loop 10896 1726882165.58951: running the handler 10896 1726882165.58961: variable 'ansible_facts' from source: unknown 10896 1726882165.58964: variable 'ansible_facts' from source: unknown 10896 1726882165.58987: _low_level_execute_command(): starting 10896 1726882165.58996: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882165.59732: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882165.59943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882165.60060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882165.61635: stdout chunk (state=3): >>>/root <<< 10896 1726882165.61731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882165.61762: stderr chunk (state=3): >>><<< 10896 1726882165.61770: stdout chunk (state=3): >>><<< 10896 1726882165.61790: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882165.61805: _low_level_execute_command(): starting 10896 1726882165.61811: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882165.617903-11293-48364913598904 `" && echo ansible-tmp-1726882165.617903-11293-48364913598904="` echo /root/.ansible/tmp/ansible-tmp-1726882165.617903-11293-48364913598904 `" ) && sleep 0' 10896 1726882165.62247: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882165.62250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 
originally 10.31.14.69 debug2: match not found <<< 10896 1726882165.62253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882165.62255: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882165.62258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882165.62313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882165.62319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882165.62377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882165.64340: stdout chunk (state=3): >>>ansible-tmp-1726882165.617903-11293-48364913598904=/root/.ansible/tmp/ansible-tmp-1726882165.617903-11293-48364913598904 <<< 10896 1726882165.64426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882165.64430: stdout chunk (state=3): >>><<< 10896 1726882165.64432: stderr chunk (state=3): >>><<< 10896 1726882165.64605: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882165.617903-11293-48364913598904=/root/.ansible/tmp/ansible-tmp-1726882165.617903-11293-48364913598904 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882165.64609: variable 'ansible_module_compression' from source: unknown 10896 1726882165.64611: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 10896 1726882165.64614: variable 'ansible_facts' from source: unknown 10896 1726882165.64798: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882165.617903-11293-48364913598904/AnsiballZ_dnf.py 10896 1726882165.65004: Sending initial data 10896 1726882165.65007: Sent initial data (150 bytes) 10896 1726882165.65723: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 10896 1726882165.65739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882165.65753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882165.65773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882165.65802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882165.65833: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882165.65854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882165.65942: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882165.65987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882165.66019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882165.66055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882165.66186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882165.67659: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 10896 1726882165.67688: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882165.67736: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10896 1726882165.67798: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpa1v1g7w0 /root/.ansible/tmp/ansible-tmp-1726882165.617903-11293-48364913598904/AnsiballZ_dnf.py <<< 10896 1726882165.67806: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882165.617903-11293-48364913598904/AnsiballZ_dnf.py" <<< 10896 1726882165.67855: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpa1v1g7w0" to remote "/root/.ansible/tmp/ansible-tmp-1726882165.617903-11293-48364913598904/AnsiballZ_dnf.py" <<< 10896 1726882165.67861: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882165.617903-11293-48364913598904/AnsiballZ_dnf.py" <<< 10896 1726882165.68986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882165.69011: stderr chunk (state=3): >>><<< 10896 1726882165.69126: stdout chunk (state=3): >>><<< 10896 1726882165.69129: done transferring module to remote 10896 1726882165.69131: _low_level_execute_command(): starting 10896 1726882165.69133: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882165.617903-11293-48364913598904/ /root/.ansible/tmp/ansible-tmp-1726882165.617903-11293-48364913598904/AnsiballZ_dnf.py && sleep 0' 10896 1726882165.69877: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882165.69892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882165.69909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882165.69926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882165.69985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882165.70040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882165.70062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882165.70092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882165.70179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882165.71946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882165.71949: stdout chunk (state=3): >>><<< 10896 1726882165.71951: stderr chunk (state=3): >>><<< 10896 1726882165.72052: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882165.72058: _low_level_execute_command(): starting 10896 1726882165.72064: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882165.617903-11293-48364913598904/AnsiballZ_dnf.py && sleep 0' 10896 1726882165.72729: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882165.72769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882165.72784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882165.72807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882165.72924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882166.13252: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 10896 1726882166.17281: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882166.17312: stderr chunk (state=3): >>><<< 10896 1726882166.17316: stdout chunk (state=3): >>><<< 10896 1726882166.17330: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
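Because pipelining is disabled for this connection ("Set connection var ansible_pipelining to False"), every module run in this log is a multi-step exchange over the shared SSH master: discover the home directory with "echo ~", create a timestamped temp dir, sftp the AnsiballZ payload, chmod it, execute it with the remote /usr/bin/python3.12, then rm -rf the temp dir. The only saving visible here is on the controller side, where this second dnf run reuses the locally cached payload ("ANSIBALLZ: using cached module ... ansible.modules.dnf-ZIP_DEFLATED") instead of rebuilding it. A hedged sketch of enabling pipelining through an inventory variable, which feeds the module source to the remote interpreter over the existing session and drops the sftp/chmod/rm round trips (hypothetical file, not part of this test run, and it assumes the target's become configuration does not require a tty):

    # group_vars/all.yml - hypothetical, not part of the inventory used in this run
    ansible_pipelining: true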
10896 1726882166.17368: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882165.617903-11293-48364913598904/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882166.17374: _low_level_execute_command(): starting 10896 1726882166.17379: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882165.617903-11293-48364913598904/ > /dev/null 2>&1 && sleep 0' 10896 1726882166.17856: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882166.17859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882166.17862: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882166.17864: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882166.17866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 10896 1726882166.17867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882166.17939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882166.17946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882166.18024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882166.19870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882166.19902: stderr chunk (state=3): >>><<< 10896 1726882166.19905: stdout chunk (state=3): >>><<< 10896 1726882166.19911: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882166.19917: handler run complete 10896 1726882166.19940: attempt loop complete, returning result 10896 1726882166.19943: _execute() done 10896 1726882166.19945: dumping result to json 10896 1726882166.19950: done dumping result, returning 10896 1726882166.19957: done running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl [12673a56-9f93-8b02-b216-000000000011] 10896 1726882166.19967: sending task result for task 12673a56-9f93-8b02-b216-000000000011 10896 1726882166.20106: done sending task result for task 12673a56-9f93-8b02-b216-000000000011 10896 1726882166.20110: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 10896 1726882166.20185: no more pending results, returning what we have 10896 1726882166.20188: results queue empty 10896 1726882166.20189: checking for any_errors_fatal 10896 1726882166.20200: done checking for any_errors_fatal 10896 1726882166.20201: checking for max_fail_percentage 10896 1726882166.20202: done checking for max_fail_percentage 10896 1726882166.20203: checking to see if all hosts have failed and the running result is not ok 10896 1726882166.20204: done checking to see if all hosts have failed 10896 1726882166.20205: getting the remaining hosts for this loop 10896 1726882166.20206: done getting the remaining hosts for this loop 10896 1726882166.20212: getting the next task for host managed_node2 10896 1726882166.20219: done getting next task for host managed_node2 10896 1726882166.20222: ^ task is: TASK: Create test interfaces 10896 1726882166.20229: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882166.20232: getting variables 10896 1726882166.20234: in VariableManager get_vars() 10896 1726882166.20283: Calling all_inventory to load vars for managed_node2 10896 1726882166.20286: Calling groups_inventory to load vars for managed_node2 10896 1726882166.20288: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882166.20323: Calling all_plugins_play to load vars for managed_node2 10896 1726882166.20326: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882166.20330: Calling groups_plugins_play to load vars for managed_node2 10896 1726882166.20465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882166.20586: done with get_vars() 10896 1726882166.20597: done getting variables 10896 1726882166.20683: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 21:29:26 -0400 (0:00:00.667) 0:00:07.774 ****** 10896 1726882166.20707: entering _queue_task() for managed_node2/shell 10896 1726882166.20708: Creating lock for shell 10896 1726882166.20934: worker is 1 (out of 1 available) 10896 1726882166.20951: exiting _queue_task() for managed_node2/shell 10896 1726882166.20962: done queuing things up, now waiting for results queue to drain 10896 1726882166.20964: waiting for pending results... 
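The dnf result above ("Nothing to do", rc=0) indicates that procps-ng, the package named in the module arguments and the provider of pgrep and sysctl, was already installed on the managed node. A minimal manual spot check of the same state, assuming shell access to that node (these commands are illustrative and not part of the captured run):

    rpm -q procps-ng          # confirm the package the task ensures is installed
    command -v pgrep sysctl   # confirm the binaries the task name refers to are on PATH
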
10896 1726882166.21118: running TaskExecutor() for managed_node2/TASK: Create test interfaces 10896 1726882166.21192: in run() - task 12673a56-9f93-8b02-b216-000000000012 10896 1726882166.21204: variable 'ansible_search_path' from source: unknown 10896 1726882166.21207: variable 'ansible_search_path' from source: unknown 10896 1726882166.21235: calling self._execute() 10896 1726882166.21291: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882166.21299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882166.21307: variable 'omit' from source: magic vars 10896 1726882166.21897: variable 'ansible_distribution_major_version' from source: facts 10896 1726882166.21900: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882166.21904: variable 'omit' from source: magic vars 10896 1726882166.21908: variable 'omit' from source: magic vars 10896 1726882166.22198: variable 'dhcp_interface1' from source: play vars 10896 1726882166.22202: variable 'dhcp_interface2' from source: play vars 10896 1726882166.22231: variable 'omit' from source: magic vars 10896 1726882166.22265: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882166.22290: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882166.22309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882166.22324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882166.22332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882166.22355: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882166.22358: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882166.22360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882166.22464: Set connection var ansible_connection to ssh 10896 1726882166.22700: Set connection var ansible_timeout to 10 10896 1726882166.22703: Set connection var ansible_shell_type to sh 10896 1726882166.22706: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882166.22711: Set connection var ansible_shell_executable to /bin/sh 10896 1726882166.22715: Set connection var ansible_pipelining to False 10896 1726882166.22718: variable 'ansible_shell_executable' from source: unknown 10896 1726882166.22720: variable 'ansible_connection' from source: unknown 10896 1726882166.22723: variable 'ansible_module_compression' from source: unknown 10896 1726882166.22725: variable 'ansible_shell_type' from source: unknown 10896 1726882166.22729: variable 'ansible_shell_executable' from source: unknown 10896 1726882166.22732: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882166.22734: variable 'ansible_pipelining' from source: unknown 10896 1726882166.22736: variable 'ansible_timeout' from source: unknown 10896 1726882166.22738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882166.22815: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882166.22836: variable 'omit' from source: magic vars 10896 1726882166.22853: starting attempt loop 10896 1726882166.22862: running the handler 10896 1726882166.22881: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882166.22913: _low_level_execute_command(): starting 10896 1726882166.22931: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882166.23638: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882166.23774: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882166.23806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882166.23896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882166.25497: stdout chunk (state=3): >>>/root <<< 10896 1726882166.25573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882166.25602: stderr chunk (state=3): >>><<< 10896 1726882166.25605: stdout chunk (state=3): >>><<< 10896 1726882166.25637: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882166.25640: _low_level_execute_command(): starting 10896 1726882166.25644: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882166.2562199-11317-272562839633646 `" && echo ansible-tmp-1726882166.2562199-11317-272562839633646="` echo /root/.ansible/tmp/ansible-tmp-1726882166.2562199-11317-272562839633646 `" ) && sleep 0' 10896 1726882166.26076: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882166.26080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882166.26090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10896 1726882166.26096: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882166.26099: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882166.26146: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882166.26150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882166.26206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882166.28044: stdout chunk (state=3): >>>ansible-tmp-1726882166.2562199-11317-272562839633646=/root/.ansible/tmp/ansible-tmp-1726882166.2562199-11317-272562839633646 <<< 10896 1726882166.28151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882166.28174: stderr chunk (state=3): >>><<< 10896 1726882166.28180: stdout chunk (state=3): >>><<< 10896 1726882166.28219: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882166.2562199-11317-272562839633646=/root/.ansible/tmp/ansible-tmp-1726882166.2562199-11317-272562839633646 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882166.28247: variable 'ansible_module_compression' from source: unknown 10896 1726882166.28398: ANSIBALLZ: Using generic lock for ansible.legacy.command 10896 1726882166.28402: ANSIBALLZ: Acquiring lock 10896 1726882166.28403: ANSIBALLZ: Lock acquired: 139646160836496 10896 1726882166.28405: ANSIBALLZ: Creating module 10896 1726882166.50685: ANSIBALLZ: Writing module into payload 10896 1726882166.50782: ANSIBALLZ: Writing module 10896 1726882166.51099: ANSIBALLZ: Renaming module 10896 1726882166.51103: ANSIBALLZ: Done creating module 10896 1726882166.51105: variable 'ansible_facts' from source: unknown 10896 1726882166.51135: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882166.2562199-11317-272562839633646/AnsiballZ_command.py 10896 1726882166.51619: Sending initial data 10896 1726882166.51623: Sent initial data (156 bytes) 10896 1726882166.52842: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882166.52846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882166.52850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882166.52852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882166.52855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882166.53109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882166.53316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882166.53524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882166.55064: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 10896 1726882166.55085: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server 
extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882166.55163: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882166.55253: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmp8baq7_zw /root/.ansible/tmp/ansible-tmp-1726882166.2562199-11317-272562839633646/AnsiballZ_command.py <<< 10896 1726882166.55303: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882166.2562199-11317-272562839633646/AnsiballZ_command.py" <<< 10896 1726882166.55336: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmp8baq7_zw" to remote "/root/.ansible/tmp/ansible-tmp-1726882166.2562199-11317-272562839633646/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882166.2562199-11317-272562839633646/AnsiballZ_command.py" <<< 10896 1726882166.56818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882166.56829: stdout chunk (state=3): >>><<< 10896 1726882166.56840: stderr chunk (state=3): >>><<< 10896 1726882166.57082: done transferring module to remote 10896 1726882166.57086: _low_level_execute_command(): starting 10896 1726882166.57088: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882166.2562199-11317-272562839633646/ /root/.ansible/tmp/ansible-tmp-1726882166.2562199-11317-272562839633646/AnsiballZ_command.py && sleep 0' 10896 1726882166.58235: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882166.58249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 10896 1726882166.58264: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882166.58519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882166.58623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882166.60404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882166.60479: stderr chunk (state=3): >>><<< 10896 1726882166.60489: stdout chunk (state=3): >>><<< 10896 1726882166.60517: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882166.60530: _low_level_execute_command(): starting 10896 1726882166.60541: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882166.2562199-11317-272562839633646/AnsiballZ_command.py && sleep 0' 10896 1726882166.61843: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882166.61846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882166.61848: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882166.61850: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882166.61852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 10896 1726882166.61854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882166.62021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882166.62123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882166.62241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882167.99171: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 6947 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 6947 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 
'!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ grep -q 'inet [1-9]'\n+ ip addr show testbr\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.co<<< 10896 1726882167.99208: stdout chunk (state=3): >>>m/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:29:26.769616", "end": "2024-09-20 21:29:27.989885", "delta": "0:00:01.220269", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10896 1726882168.01027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882168.01031: stdout chunk (state=3): >>><<< 10896 1726882168.01034: stderr chunk (state=3): >>><<< 10896 1726882168.01042: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 6947 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 6947 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ grep -q 'inet [1-9]'\n+ ip addr show testbr\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:29:26.769616", "end": "2024-09-20 21:29:27.989885", "delta": "0:00:01.220269", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
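For readability, the core of the "Create test interfaces" script embedded in the result above reduces to the following steps, condensed from the cmd/_raw_params fields shown in the log; this is only a sketch, not a drop-in replacement, since the full script also handles the RHEL 6 branch, the NetworkManager address retry loop, and the firewalld service rules:

    # Condensed from the _raw_params above; RHEL 6 handling, retry loop and firewalld rules omitted.
    ip link add test1 type veth peer name test1p          # veth pairs acting as test NICs
    ip link add test2 type veth peer name test2p
    ip link set test1p up
    ip link set test2p up
    ip link add name testbr type bridge forward_delay 0   # bridge that will host the DHCP server
    ip link set testbr up
    ip addr add 192.0.2.1/24 dev testbr
    ip -6 addr add 2001:DB8::1/32 dev testbr
    ip link set test1p master testbr
    ip link set test2p master testbr
    dnsmasq --pid-file=/run/dhcp_testbr.pid \
            --dhcp-leasefile=/run/dhcp_testbr.lease \
            --dhcp-range=192.0.2.1,192.0.2.254,240 \
            --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 \
            --enable-ra --interface=testbr --bind-interfaces
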
10896 1726882168.01242: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882166.2562199-11317-272562839633646/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882168.01246: _low_level_execute_command(): starting 10896 1726882168.01249: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882166.2562199-11317-272562839633646/ > /dev/null 2>&1 && sleep 0' 10896 1726882168.02116: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882168.02129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882168.02142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882168.02240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882168.04124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882168.04143: stdout chunk (state=3): >>><<< 10896 1726882168.04155: stderr chunk (state=3): >>><<< 10896 1726882168.04179: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882168.04191: handler run complete 10896 1726882168.04223: Evaluated conditional (False): False 10896 1726882168.04250: attempt loop complete, returning result 10896 1726882168.04257: _execute() done 10896 1726882168.04301: dumping result to json 10896 1726882168.04304: done dumping result, returning 10896 1726882168.04307: done running TaskExecutor() for managed_node2/TASK: Create test interfaces [12673a56-9f93-8b02-b216-000000000012] 10896 1726882168.04309: sending task result for task 12673a56-9f93-8b02-b216-000000000012 ok: [managed_node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.220269", "end": "2024-09-20 21:29:27.989885", "rc": 0, "start": "2024-09-20 21:29:26.769616" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 6947 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 6947 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + grep -q 'inet [1-9]' + ip addr show testbr + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 10896 1726882168.04683: no more pending results, returning what we have 10896 1726882168.04686: results queue empty 10896 1726882168.04687: checking for any_errors_fatal 10896 1726882168.04699: done checking for any_errors_fatal 10896 1726882168.04700: checking for max_fail_percentage 10896 1726882168.04708: done checking for max_fail_percentage 10896 1726882168.04709: checking to see if all hosts have failed and the running result is not ok 10896 1726882168.04710: done checking to see if all hosts have failed 10896 1726882168.04711: getting the remaining hosts for this loop 10896 1726882168.04713: done getting the remaining hosts for this loop 10896 1726882168.04716: getting the next task for host managed_node2 10896 1726882168.04726: done getting next task for host managed_node2 10896 1726882168.04729: ^ task is: TASK: 
Include the task 'get_interface_stat.yml' 10896 1726882168.04733: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882168.04736: getting variables 10896 1726882168.04738: in VariableManager get_vars() 10896 1726882168.04780: Calling all_inventory to load vars for managed_node2 10896 1726882168.04783: Calling groups_inventory to load vars for managed_node2 10896 1726882168.04786: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882168.04935: Calling all_plugins_play to load vars for managed_node2 10896 1726882168.04939: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882168.04942: Calling groups_plugins_play to load vars for managed_node2 10896 1726882168.05282: done sending task result for task 12673a56-9f93-8b02-b216-000000000012 10896 1726882168.05286: WORKER PROCESS EXITING 10896 1726882168.05317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882168.05525: done with get_vars() 10896 1726882168.05536: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:29:28 -0400 (0:00:01.849) 0:00:09.623 ****** 10896 1726882168.05642: entering _queue_task() for managed_node2/include_tasks 10896 1726882168.05968: worker is 1 (out of 1 available) 10896 1726882168.05982: exiting _queue_task() for managed_node2/include_tasks 10896 1726882168.06000: done queuing things up, now waiting for results queue to drain 10896 1726882168.06001: waiting for pending results... 
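Before the include below runs its per-interface checks, the environment created by the previous task can be verified by hand. A hypothetical spot check using standard iproute2/procps tools (not part of the captured run):

    ip -br link show | grep -E '^(test1|test2|testbr)'   # the veth ends and the bridge exist and are UP
    ip addr show testbr | grep 'inet '                   # 192.0.2.1/24 was assigned to the bridge
    pgrep -a dnsmasq                                     # the DHCP server started by the script is running
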
10896 1726882168.06195: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 10896 1726882168.06320: in run() - task 12673a56-9f93-8b02-b216-000000000016 10896 1726882168.06342: variable 'ansible_search_path' from source: unknown 10896 1726882168.06348: variable 'ansible_search_path' from source: unknown 10896 1726882168.06384: calling self._execute() 10896 1726882168.06475: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882168.06488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882168.06510: variable 'omit' from source: magic vars 10896 1726882168.06918: variable 'ansible_distribution_major_version' from source: facts 10896 1726882168.06935: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882168.06953: _execute() done 10896 1726882168.06961: dumping result to json 10896 1726882168.06969: done dumping result, returning 10896 1726882168.06980: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-8b02-b216-000000000016] 10896 1726882168.06999: sending task result for task 12673a56-9f93-8b02-b216-000000000016 10896 1726882168.07226: no more pending results, returning what we have 10896 1726882168.07231: in VariableManager get_vars() 10896 1726882168.07276: Calling all_inventory to load vars for managed_node2 10896 1726882168.07279: Calling groups_inventory to load vars for managed_node2 10896 1726882168.07282: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882168.07298: Calling all_plugins_play to load vars for managed_node2 10896 1726882168.07301: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882168.07305: Calling groups_plugins_play to load vars for managed_node2 10896 1726882168.07589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882168.07840: done with get_vars() 10896 1726882168.07847: variable 'ansible_search_path' from source: unknown 10896 1726882168.07848: variable 'ansible_search_path' from source: unknown 10896 1726882168.07900: done sending task result for task 12673a56-9f93-8b02-b216-000000000016 10896 1726882168.07903: WORKER PROCESS EXITING 10896 1726882168.07931: we have included files to process 10896 1726882168.07933: generating all_blocks data 10896 1726882168.07934: done generating all_blocks data 10896 1726882168.07935: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10896 1726882168.08047: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10896 1726882168.08050: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10896 1726882168.08514: done processing included file 10896 1726882168.08515: iterating over new_blocks loaded from include file 10896 1726882168.08517: in VariableManager get_vars() 10896 1726882168.08540: done with get_vars() 10896 1726882168.08542: filtering new block on tags 10896 1726882168.08557: done filtering new block on tags 10896 1726882168.08560: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 10896 
1726882168.08565: extending task lists for all hosts with included blocks 10896 1726882168.08808: done extending task lists 10896 1726882168.08810: done processing included files 10896 1726882168.08811: results queue empty 10896 1726882168.08811: checking for any_errors_fatal 10896 1726882168.08817: done checking for any_errors_fatal 10896 1726882168.08818: checking for max_fail_percentage 10896 1726882168.08819: done checking for max_fail_percentage 10896 1726882168.08820: checking to see if all hosts have failed and the running result is not ok 10896 1726882168.08821: done checking to see if all hosts have failed 10896 1726882168.08822: getting the remaining hosts for this loop 10896 1726882168.08823: done getting the remaining hosts for this loop 10896 1726882168.08825: getting the next task for host managed_node2 10896 1726882168.08829: done getting next task for host managed_node2 10896 1726882168.08831: ^ task is: TASK: Get stat for interface {{ interface }} 10896 1726882168.08834: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882168.08836: getting variables 10896 1726882168.08837: in VariableManager get_vars() 10896 1726882168.08850: Calling all_inventory to load vars for managed_node2 10896 1726882168.08852: Calling groups_inventory to load vars for managed_node2 10896 1726882168.08896: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882168.08903: Calling all_plugins_play to load vars for managed_node2 10896 1726882168.08906: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882168.08909: Calling groups_plugins_play to load vars for managed_node2 10896 1726882168.09157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882168.09572: done with get_vars() 10896 1726882168.09580: done getting variables 10896 1726882168.09767: variable 'interface' from source: task vars 10896 1726882168.09772: variable 'dhcp_interface1' from source: play vars 10896 1726882168.10049: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:29:28 -0400 (0:00:00.044) 0:00:09.667 ****** 10896 1726882168.10090: entering _queue_task() for managed_node2/stat 10896 1726882168.10708: worker is 1 (out of 1 available) 10896 1726882168.10720: exiting _queue_task() for managed_node2/stat 10896 1726882168.10731: done queuing things up, now waiting for results queue to drain 10896 1726882168.10732: waiting for pending results... 
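The included file resolves the templated task name to "Get stat for interface test1" and, as the module_args printed further down show, calls the stat module on /sys/class/net/<interface> with attribute, checksum and MIME collection turned off, recording the result as interface_stat (the log later reports that variable with source set_fact, which is how registered results are stored). A hedged sketch of get_interface_stat.yml under those assumptions; follow and checksum_algorithm appear only at their default values in the invocation, so they are omitted here.

# tasks/get_interface_stat.yml -- hypothetical reconstruction from this log
- name: Get stat for interface {{ interface }}
  stat:
    path: /sys/class/net/{{ interface }}   # path seen in the module_args below
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat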
10896 1726882168.11059: running TaskExecutor() for managed_node2/TASK: Get stat for interface test1 10896 1726882168.11264: in run() - task 12673a56-9f93-8b02-b216-000000000153 10896 1726882168.11277: variable 'ansible_search_path' from source: unknown 10896 1726882168.11283: variable 'ansible_search_path' from source: unknown 10896 1726882168.11324: calling self._execute() 10896 1726882168.11433: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882168.11436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882168.11438: variable 'omit' from source: magic vars 10896 1726882168.11746: variable 'ansible_distribution_major_version' from source: facts 10896 1726882168.11768: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882168.11778: variable 'omit' from source: magic vars 10896 1726882168.11835: variable 'omit' from source: magic vars 10896 1726882168.11972: variable 'interface' from source: task vars 10896 1726882168.11975: variable 'dhcp_interface1' from source: play vars 10896 1726882168.12021: variable 'dhcp_interface1' from source: play vars 10896 1726882168.12045: variable 'omit' from source: magic vars 10896 1726882168.12098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882168.12189: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882168.12196: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882168.12199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882168.12202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882168.12232: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882168.12241: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882168.12249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882168.12356: Set connection var ansible_connection to ssh 10896 1726882168.12369: Set connection var ansible_timeout to 10 10896 1726882168.12376: Set connection var ansible_shell_type to sh 10896 1726882168.12389: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882168.12499: Set connection var ansible_shell_executable to /bin/sh 10896 1726882168.12504: Set connection var ansible_pipelining to False 10896 1726882168.12507: variable 'ansible_shell_executable' from source: unknown 10896 1726882168.12513: variable 'ansible_connection' from source: unknown 10896 1726882168.12519: variable 'ansible_module_compression' from source: unknown 10896 1726882168.12521: variable 'ansible_shell_type' from source: unknown 10896 1726882168.12523: variable 'ansible_shell_executable' from source: unknown 10896 1726882168.12525: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882168.12527: variable 'ansible_pipelining' from source: unknown 10896 1726882168.12530: variable 'ansible_timeout' from source: unknown 10896 1726882168.12532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882168.12709: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10896 1726882168.12725: variable 'omit' from source: magic vars 10896 1726882168.12742: starting attempt loop 10896 1726882168.12750: running the handler 10896 1726882168.12767: _low_level_execute_command(): starting 10896 1726882168.12780: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882168.13726: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882168.13734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882168.13748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882168.13766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882168.13858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882168.15453: stdout chunk (state=3): >>>/root <<< 10896 1726882168.15552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882168.15606: stderr chunk (state=3): >>><<< 10896 1726882168.15623: stdout chunk (state=3): >>><<< 10896 1726882168.15719: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882168.15723: _low_level_execute_command(): starting 10896 1726882168.15726: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882168.156423-11411-64666454329092 `" && echo ansible-tmp-1726882168.156423-11411-64666454329092="` echo /root/.ansible/tmp/ansible-tmp-1726882168.156423-11411-64666454329092 `" ) && sleep 0' 10896 1726882168.16287: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882168.16290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882168.16386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882168.16411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882168.16463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882168.16533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882168.18380: stdout chunk (state=3): >>>ansible-tmp-1726882168.156423-11411-64666454329092=/root/.ansible/tmp/ansible-tmp-1726882168.156423-11411-64666454329092 <<< 10896 1726882168.18554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882168.18558: stdout chunk (state=3): >>><<< 10896 1726882168.18561: stderr chunk (state=3): >>><<< 10896 1726882168.18578: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882168.156423-11411-64666454329092=/root/.ansible/tmp/ansible-tmp-1726882168.156423-11411-64666454329092 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882168.18659: variable 'ansible_module_compression' from source: unknown 
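Before the module transfer, the executor resolved the effective connection settings for managed_node2: ansible_connection=ssh, ansible_timeout=10, ansible_shell_type=sh, ansible_shell_executable=/bin/sh, ansible_pipelining=False and ansible_module_compression=ZIP_DEFLATED. Nothing needs to change for this run, but if you wanted those values pinned explicitly, an illustrative host_vars entry would look like the following (values copied from the "Set connection var" lines above; pipelining being off is consistent with the temp-dir creation and sftp upload that follow).

# host_vars/managed_node2.yml -- illustrative only; mirrors the values the log reports
ansible_connection: ssh
ansible_timeout: 10
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: false
ansible_module_compression: ZIP_DEFLATED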
10896 1726882168.18708: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10896 1726882168.18775: variable 'ansible_facts' from source: unknown 10896 1726882168.18856: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882168.156423-11411-64666454329092/AnsiballZ_stat.py 10896 1726882168.19027: Sending initial data 10896 1726882168.19030: Sent initial data (151 bytes) 10896 1726882168.19683: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882168.19757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882168.19792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882168.19861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882168.21413: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882168.21491: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10896 1726882168.21562: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpzoikvlo6 /root/.ansible/tmp/ansible-tmp-1726882168.156423-11411-64666454329092/AnsiballZ_stat.py <<< 10896 1726882168.21565: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882168.156423-11411-64666454329092/AnsiballZ_stat.py" <<< 10896 1726882168.21631: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpzoikvlo6" to remote "/root/.ansible/tmp/ansible-tmp-1726882168.156423-11411-64666454329092/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882168.156423-11411-64666454329092/AnsiballZ_stat.py" <<< 10896 1726882168.22602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882168.22606: stdout chunk (state=3): >>><<< 10896 1726882168.22608: stderr chunk (state=3): >>><<< 10896 1726882168.22610: done transferring module to remote 10896 1726882168.22612: _low_level_execute_command(): starting 10896 1726882168.22614: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882168.156423-11411-64666454329092/ /root/.ansible/tmp/ansible-tmp-1726882168.156423-11411-64666454329092/AnsiballZ_stat.py && sleep 0' 10896 1726882168.23163: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882168.23179: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882168.23201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882168.23222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882168.23327: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882168.23346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882168.23364: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882168.23460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882168.25175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882168.25215: stderr chunk (state=3): >>><<< 10896 1726882168.25233: stdout chunk (state=3): >>><<< 10896 1726882168.25254: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882168.25289: _low_level_execute_command(): starting 10896 1726882168.25305: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882168.156423-11411-64666454329092/AnsiballZ_stat.py && sleep 0' 10896 1726882168.26052: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882168.26066: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882168.26111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882168.26125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882168.26138: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 10896 1726882168.26214: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882168.26265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882168.26301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882168.26339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882168.26410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882168.41437: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26527, "dev": 23, "nlink": 1, "atime": 1726882166.776113, "mtime": 1726882166.776113, "ctime": 1726882166.776113, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": 
"../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10896 1726882168.42751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882168.42754: stdout chunk (state=3): >>><<< 10896 1726882168.42756: stderr chunk (state=3): >>><<< 10896 1726882168.42758: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26527, "dev": 23, "nlink": 1, "atime": 1726882166.776113, "mtime": 1726882166.776113, "ctime": 1726882166.776113, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
10896 1726882168.42931: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882168.156423-11411-64666454329092/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882168.42978: _low_level_execute_command(): starting 10896 1726882168.43156: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882168.156423-11411-64666454329092/ > /dev/null 2>&1 && sleep 0' 10896 1726882168.44203: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882168.44206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882168.44209: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882168.44211: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882168.44214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882168.44258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882168.44283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882168.44536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882168.44626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882168.46428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882168.46457: stderr chunk (state=3): >>><<< 10896 1726882168.46492: stdout chunk (state=3): >>><<< 10896 1726882168.46545: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882168.46569: handler run complete 10896 1726882168.46804: attempt loop complete, returning result 10896 1726882168.46808: _execute() done 10896 1726882168.46810: dumping result to json 10896 1726882168.46812: done dumping result, returning 10896 1726882168.46814: done running TaskExecutor() for managed_node2/TASK: Get stat for interface test1 [12673a56-9f93-8b02-b216-000000000153] 10896 1726882168.46816: sending task result for task 12673a56-9f93-8b02-b216-000000000153 ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882166.776113, "block_size": 4096, "blocks": 0, "ctime": 1726882166.776113, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26527, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726882166.776113, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 10896 1726882168.47051: no more pending results, returning what we have 10896 1726882168.47054: results queue empty 10896 1726882168.47055: checking for any_errors_fatal 10896 1726882168.47057: done checking for any_errors_fatal 10896 1726882168.47057: checking for max_fail_percentage 10896 1726882168.47059: done checking for max_fail_percentage 10896 1726882168.47060: checking to see if all hosts have failed and the running result is not ok 10896 1726882168.47061: done checking to see if all hosts have failed 10896 1726882168.47061: getting the remaining hosts for this loop 10896 1726882168.47063: done getting the remaining hosts for this loop 10896 1726882168.47067: getting the next task for host managed_node2 10896 1726882168.47075: done getting next task for host managed_node2 10896 1726882168.47078: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 10896 1726882168.47081: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882168.47087: getting variables 10896 1726882168.47088: in VariableManager get_vars() 10896 1726882168.47133: Calling all_inventory to load vars for managed_node2 10896 1726882168.47137: Calling groups_inventory to load vars for managed_node2 10896 1726882168.47140: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882168.47152: Calling all_plugins_play to load vars for managed_node2 10896 1726882168.47155: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882168.47159: Calling groups_plugins_play to load vars for managed_node2 10896 1726882168.48216: done sending task result for task 12673a56-9f93-8b02-b216-000000000153 10896 1726882168.48219: WORKER PROCESS EXITING 10896 1726882168.48242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882168.48559: done with get_vars() 10896 1726882168.48570: done getting variables 10896 1726882168.48788: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 10896 1726882168.49012: variable 'interface' from source: task vars 10896 1726882168.49017: variable 'dhcp_interface1' from source: play vars 10896 1726882168.49192: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:29:28 -0400 (0:00:00.391) 0:00:10.059 ****** 10896 1726882168.49290: entering _queue_task() for managed_node2/assert 10896 1726882168.49291: Creating lock for assert 10896 1726882168.49799: worker is 1 (out of 1 available) 10896 1726882168.49927: exiting _queue_task() for managed_node2/assert 10896 1726882168.49940: done queuing things up, now waiting for results queue to drain 10896 1726882168.49941: waiting for pending results... 
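The assert queued here checks interface_stat.stat.exists for test1; a few entries further down the same assert_device_present.yml / get_interface_stat.yml pair runs again with interface resolving through dhcp_interface2 to test2. That pattern is consistent with the calling test playbook including the helper once per DHCP test interface, roughly as in the hypothetical sketch below (the task names and file layout are assumptions; only the variable names and their resolved values, test1 and test2, come from this log).

# In the calling test play -- hypothetical; one include per DHCP test interface
- name: Assert that the first DHCP test interface is present
  include_tasks: tasks/assert_device_present.yml
  vars:
    interface: "{{ dhcp_interface1 }}"   # resolves to test1 in this run

- name: Assert that the second DHCP test interface is present
  include_tasks: tasks/assert_device_present.yml
  vars:
    interface: "{{ dhcp_interface2 }}"   # resolves to test2 in this run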
10896 1726882168.50413: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test1' 10896 1726882168.50541: in run() - task 12673a56-9f93-8b02-b216-000000000017 10896 1726882168.50716: variable 'ansible_search_path' from source: unknown 10896 1726882168.51001: variable 'ansible_search_path' from source: unknown 10896 1726882168.51005: calling self._execute() 10896 1726882168.51018: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882168.51028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882168.51040: variable 'omit' from source: magic vars 10896 1726882168.52034: variable 'ansible_distribution_major_version' from source: facts 10896 1726882168.52400: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882168.52403: variable 'omit' from source: magic vars 10896 1726882168.52406: variable 'omit' from source: magic vars 10896 1726882168.52469: variable 'interface' from source: task vars 10896 1726882168.53000: variable 'dhcp_interface1' from source: play vars 10896 1726882168.53004: variable 'dhcp_interface1' from source: play vars 10896 1726882168.53007: variable 'omit' from source: magic vars 10896 1726882168.53036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882168.53076: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882168.53104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882168.53217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882168.53232: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882168.53263: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882168.53700: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882168.53703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882168.53717: Set connection var ansible_connection to ssh 10896 1726882168.53728: Set connection var ansible_timeout to 10 10896 1726882168.53734: Set connection var ansible_shell_type to sh 10896 1726882168.53745: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882168.53754: Set connection var ansible_shell_executable to /bin/sh 10896 1726882168.53765: Set connection var ansible_pipelining to False 10896 1726882168.53797: variable 'ansible_shell_executable' from source: unknown 10896 1726882168.54099: variable 'ansible_connection' from source: unknown 10896 1726882168.54102: variable 'ansible_module_compression' from source: unknown 10896 1726882168.54104: variable 'ansible_shell_type' from source: unknown 10896 1726882168.54106: variable 'ansible_shell_executable' from source: unknown 10896 1726882168.54108: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882168.54109: variable 'ansible_pipelining' from source: unknown 10896 1726882168.54111: variable 'ansible_timeout' from source: unknown 10896 1726882168.54113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882168.54280: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882168.54500: variable 'omit' from source: magic vars 10896 1726882168.54504: starting attempt loop 10896 1726882168.54507: running the handler 10896 1726882168.54833: variable 'interface_stat' from source: set_fact 10896 1726882168.54858: Evaluated conditional (interface_stat.stat.exists): True 10896 1726882168.54869: handler run complete 10896 1726882168.54887: attempt loop complete, returning result 10896 1726882168.54899: _execute() done 10896 1726882168.55204: dumping result to json 10896 1726882168.55208: done dumping result, returning 10896 1726882168.55210: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test1' [12673a56-9f93-8b02-b216-000000000017] 10896 1726882168.55212: sending task result for task 12673a56-9f93-8b02-b216-000000000017 10896 1726882168.55284: done sending task result for task 12673a56-9f93-8b02-b216-000000000017 10896 1726882168.55288: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 10896 1726882168.55343: no more pending results, returning what we have 10896 1726882168.55347: results queue empty 10896 1726882168.55348: checking for any_errors_fatal 10896 1726882168.55360: done checking for any_errors_fatal 10896 1726882168.55361: checking for max_fail_percentage 10896 1726882168.55363: done checking for max_fail_percentage 10896 1726882168.55364: checking to see if all hosts have failed and the running result is not ok 10896 1726882168.55365: done checking to see if all hosts have failed 10896 1726882168.55365: getting the remaining hosts for this loop 10896 1726882168.55367: done getting the remaining hosts for this loop 10896 1726882168.55370: getting the next task for host managed_node2 10896 1726882168.55381: done getting next task for host managed_node2 10896 1726882168.55383: ^ task is: TASK: Include the task 'get_interface_stat.yml' 10896 1726882168.55387: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882168.55391: getting variables 10896 1726882168.55396: in VariableManager get_vars() 10896 1726882168.55443: Calling all_inventory to load vars for managed_node2 10896 1726882168.55446: Calling groups_inventory to load vars for managed_node2 10896 1726882168.55449: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882168.55462: Calling all_plugins_play to load vars for managed_node2 10896 1726882168.55465: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882168.55468: Calling groups_plugins_play to load vars for managed_node2 10896 1726882168.56227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882168.56650: done with get_vars() 10896 1726882168.56661: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:29:28 -0400 (0:00:00.075) 0:00:10.135 ****** 10896 1726882168.56868: entering _queue_task() for managed_node2/include_tasks 10896 1726882168.57443: worker is 1 (out of 1 available) 10896 1726882168.57456: exiting _queue_task() for managed_node2/include_tasks 10896 1726882168.57469: done queuing things up, now waiting for results queue to drain 10896 1726882168.57470: waiting for pending results... 10896 1726882168.58285: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 10896 1726882168.58329: in run() - task 12673a56-9f93-8b02-b216-00000000001b 10896 1726882168.58347: variable 'ansible_search_path' from source: unknown 10896 1726882168.58535: variable 'ansible_search_path' from source: unknown 10896 1726882168.58538: calling self._execute() 10896 1726882168.58805: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882168.58816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882168.58829: variable 'omit' from source: magic vars 10896 1726882168.59634: variable 'ansible_distribution_major_version' from source: facts 10896 1726882168.59678: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882168.59687: _execute() done 10896 1726882168.59696: dumping result to json 10896 1726882168.59703: done dumping result, returning 10896 1726882168.59712: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-8b02-b216-00000000001b] 10896 1726882168.60100: sending task result for task 12673a56-9f93-8b02-b216-00000000001b 10896 1726882168.60164: done sending task result for task 12673a56-9f93-8b02-b216-00000000001b 10896 1726882168.60167: WORKER PROCESS EXITING 10896 1726882168.60190: no more pending results, returning what we have 10896 1726882168.60198: in VariableManager get_vars() 10896 1726882168.60239: Calling all_inventory to load vars for managed_node2 10896 1726882168.60242: Calling groups_inventory to load vars for managed_node2 10896 1726882168.60244: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882168.60254: Calling all_plugins_play to load vars for managed_node2 10896 1726882168.60257: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882168.60261: Calling groups_plugins_play to load vars for managed_node2 10896 1726882168.60730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 10896 1726882168.61122: done with get_vars() 10896 1726882168.61129: variable 'ansible_search_path' from source: unknown 10896 1726882168.61131: variable 'ansible_search_path' from source: unknown 10896 1726882168.61163: we have included files to process 10896 1726882168.61165: generating all_blocks data 10896 1726882168.61166: done generating all_blocks data 10896 1726882168.61170: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10896 1726882168.61172: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10896 1726882168.61174: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10896 1726882168.61580: done processing included file 10896 1726882168.61582: iterating over new_blocks loaded from include file 10896 1726882168.61584: in VariableManager get_vars() 10896 1726882168.61608: done with get_vars() 10896 1726882168.61610: filtering new block on tags 10896 1726882168.61626: done filtering new block on tags 10896 1726882168.61629: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 10896 1726882168.61634: extending task lists for all hosts with included blocks 10896 1726882168.61736: done extending task lists 10896 1726882168.61737: done processing included files 10896 1726882168.61738: results queue empty 10896 1726882168.61738: checking for any_errors_fatal 10896 1726882168.61741: done checking for any_errors_fatal 10896 1726882168.61742: checking for max_fail_percentage 10896 1726882168.61743: done checking for max_fail_percentage 10896 1726882168.61744: checking to see if all hosts have failed and the running result is not ok 10896 1726882168.61744: done checking to see if all hosts have failed 10896 1726882168.61745: getting the remaining hosts for this loop 10896 1726882168.61746: done getting the remaining hosts for this loop 10896 1726882168.61748: getting the next task for host managed_node2 10896 1726882168.61752: done getting next task for host managed_node2 10896 1726882168.61754: ^ task is: TASK: Get stat for interface {{ interface }} 10896 1726882168.61757: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882168.61759: getting variables 10896 1726882168.61760: in VariableManager get_vars() 10896 1726882168.61773: Calling all_inventory to load vars for managed_node2 10896 1726882168.61775: Calling groups_inventory to load vars for managed_node2 10896 1726882168.61777: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882168.61782: Calling all_plugins_play to load vars for managed_node2 10896 1726882168.61784: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882168.61787: Calling groups_plugins_play to load vars for managed_node2 10896 1726882168.62115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882168.62899: done with get_vars() 10896 1726882168.62907: done getting variables 10896 1726882168.63248: variable 'interface' from source: task vars 10896 1726882168.63253: variable 'dhcp_interface2' from source: play vars 10896 1726882168.63318: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:29:28 -0400 (0:00:00.064) 0:00:10.200 ****** 10896 1726882168.63349: entering _queue_task() for managed_node2/stat 10896 1726882168.64033: worker is 1 (out of 1 available) 10896 1726882168.64043: exiting _queue_task() for managed_node2/stat 10896 1726882168.64054: done queuing things up, now waiting for results queue to drain 10896 1726882168.64055: waiting for pending results... 10896 1726882168.64210: running TaskExecutor() for managed_node2/TASK: Get stat for interface test2 10896 1726882168.64428: in run() - task 12673a56-9f93-8b02-b216-00000000016b 10896 1726882168.64440: variable 'ansible_search_path' from source: unknown 10896 1726882168.64443: variable 'ansible_search_path' from source: unknown 10896 1726882168.64630: calling self._execute() 10896 1726882168.64709: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882168.64804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882168.64815: variable 'omit' from source: magic vars 10896 1726882168.65537: variable 'ansible_distribution_major_version' from source: facts 10896 1726882168.65698: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882168.65702: variable 'omit' from source: magic vars 10896 1726882168.65705: variable 'omit' from source: magic vars 10896 1726882168.65952: variable 'interface' from source: task vars 10896 1726882168.66053: variable 'dhcp_interface2' from source: play vars 10896 1726882168.66119: variable 'dhcp_interface2' from source: play vars 10896 1726882168.66144: variable 'omit' from source: magic vars 10896 1726882168.66301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882168.66487: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882168.66491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882168.66495: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882168.66501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 10896 1726882168.66504: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882168.66506: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882168.66508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882168.66800: Set connection var ansible_connection to ssh 10896 1726882168.66811: Set connection var ansible_timeout to 10 10896 1726882168.66819: Set connection var ansible_shell_type to sh 10896 1726882168.66833: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882168.66842: Set connection var ansible_shell_executable to /bin/sh 10896 1726882168.66852: Set connection var ansible_pipelining to False 10896 1726882168.66880: variable 'ansible_shell_executable' from source: unknown 10896 1726882168.66925: variable 'ansible_connection' from source: unknown 10896 1726882168.67101: variable 'ansible_module_compression' from source: unknown 10896 1726882168.67104: variable 'ansible_shell_type' from source: unknown 10896 1726882168.67107: variable 'ansible_shell_executable' from source: unknown 10896 1726882168.67110: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882168.67114: variable 'ansible_pipelining' from source: unknown 10896 1726882168.67117: variable 'ansible_timeout' from source: unknown 10896 1726882168.67120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882168.67480: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10896 1726882168.67500: variable 'omit' from source: magic vars 10896 1726882168.67511: starting attempt loop 10896 1726882168.67519: running the handler 10896 1726882168.67537: _low_level_execute_command(): starting 10896 1726882168.67550: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882168.68651: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882168.68669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882168.68708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882168.68723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882168.68742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882168.68828: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882168.68849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882168.68948: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882168.70555: stdout chunk (state=3): >>>/root <<< 10896 1726882168.70721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882168.70743: stdout chunk (state=3): >>><<< 10896 1726882168.70756: stderr chunk (state=3): >>><<< 10896 1726882168.71103: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882168.71107: _low_level_execute_command(): starting 10896 1726882168.71110: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882168.709219-11450-113200178503453 `" && echo ansible-tmp-1726882168.709219-11450-113200178503453="` echo /root/.ansible/tmp/ansible-tmp-1726882168.709219-11450-113200178503453 `" ) && sleep 0' 10896 1726882168.72063: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882168.72075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882168.72192: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882168.72310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882168.72416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882168.72654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882168.74367: stdout chunk (state=3): 
>>>ansible-tmp-1726882168.709219-11450-113200178503453=/root/.ansible/tmp/ansible-tmp-1726882168.709219-11450-113200178503453 <<< 10896 1726882168.74462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882168.74506: stderr chunk (state=3): >>><<< 10896 1726882168.74516: stdout chunk (state=3): >>><<< 10896 1726882168.74616: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882168.709219-11450-113200178503453=/root/.ansible/tmp/ansible-tmp-1726882168.709219-11450-113200178503453 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882168.74670: variable 'ansible_module_compression' from source: unknown 10896 1726882168.74787: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10896 1726882168.74839: variable 'ansible_facts' from source: unknown 10896 1726882168.74999: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882168.709219-11450-113200178503453/AnsiballZ_stat.py 10896 1726882168.75404: Sending initial data 10896 1726882168.75418: Sent initial data (152 bytes) 10896 1726882168.76807: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882168.76879: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882168.77038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882168.77128: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 10896 1726882168.78799: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882168.78855: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882168.78913: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpa65imws8 /root/.ansible/tmp/ansible-tmp-1726882168.709219-11450-113200178503453/AnsiballZ_stat.py <<< 10896 1726882168.78924: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882168.709219-11450-113200178503453/AnsiballZ_stat.py" <<< 10896 1726882168.79129: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpa65imws8" to remote "/root/.ansible/tmp/ansible-tmp-1726882168.709219-11450-113200178503453/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882168.709219-11450-113200178503453/AnsiballZ_stat.py" <<< 10896 1726882168.80329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882168.80409: stderr chunk (state=3): >>><<< 10896 1726882168.80431: stdout chunk (state=3): >>><<< 10896 1726882168.80475: done transferring module to remote 10896 1726882168.80491: _low_level_execute_command(): starting 10896 1726882168.80506: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882168.709219-11450-113200178503453/ /root/.ansible/tmp/ansible-tmp-1726882168.709219-11450-113200178503453/AnsiballZ_stat.py && sleep 0' 10896 1726882168.81146: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882168.81172: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882168.81188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882168.81278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882168.81324: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882168.81342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882168.81364: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882168.81532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882168.83283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882168.83287: stdout chunk (state=3): >>><<< 10896 1726882168.83289: stderr chunk (state=3): >>><<< 10896 1726882168.83308: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882168.83317: _low_level_execute_command(): starting 10896 1726882168.83390: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882168.709219-11450-113200178503453/AnsiballZ_stat.py && sleep 0' 10896 1726882168.83907: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882168.83922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882168.83939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882168.83959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882168.83976: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882168.83987: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882168.84209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882168.84245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882168.84261: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882168.84279: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882168.84374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882168.99296: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26933, "dev": 23, "nlink": 1, "atime": 1726882166.779362, "mtime": 1726882166.779362, "ctime": 1726882166.779362, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10896 1726882169.00496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882169.00511: stdout chunk (state=3): >>><<< 10896 1726882169.00524: stderr chunk (state=3): >>><<< 10896 1726882169.00556: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26933, "dev": 23, "nlink": 1, "atime": 1726882166.779362, "mtime": 1726882166.779362, "ctime": 1726882166.779362, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 10896 1726882169.00615: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882168.709219-11450-113200178503453/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882169.00633: _low_level_execute_command(): starting 10896 1726882169.00700: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882168.709219-11450-113200178503453/ > /dev/null 2>&1 && sleep 0' 10896 1726882169.01317: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882169.01333: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882169.01376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10896 1726882169.01390: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 10896 1726882169.01484: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882169.01532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882169.01582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882169.03470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882169.03474: stdout chunk (state=3): >>><<< 10896 1726882169.03477: stderr chunk (state=3): >>><<< 10896 1726882169.03700: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882169.03704: handler run complete 10896 1726882169.03707: attempt loop complete, returning result 10896 1726882169.03709: _execute() done 10896 1726882169.03711: dumping result to json 10896 1726882169.03712: done dumping result, returning 10896 1726882169.03714: done running TaskExecutor() for managed_node2/TASK: Get stat for interface test2 [12673a56-9f93-8b02-b216-00000000016b] 10896 1726882169.03716: sending task result for task 12673a56-9f93-8b02-b216-00000000016b 10896 1726882169.03787: done sending task result for task 12673a56-9f93-8b02-b216-00000000016b 10896 1726882169.03790: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882166.779362, "block_size": 4096, "blocks": 0, "ctime": 1726882166.779362, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26933, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726882166.779362, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 10896 1726882169.03885: no more pending results, returning what we have 10896 1726882169.03888: results queue empty 10896 1726882169.03889: checking for any_errors_fatal 10896 1726882169.03890: done checking for any_errors_fatal 10896 1726882169.03891: checking for max_fail_percentage 10896 1726882169.03896: done checking for max_fail_percentage 10896 1726882169.03897: checking to see if all hosts have failed and the running result is not ok 10896 1726882169.04097: done checking to see if all hosts have failed 10896 1726882169.04098: getting the remaining hosts for this loop 10896 1726882169.04100: done getting the remaining hosts for this loop 10896 1726882169.04104: getting the next task for host managed_node2 10896 1726882169.04111: done getting next task for host managed_node2 10896 1726882169.04113: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 10896 1726882169.04115: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 10896 1726882169.04119: getting variables 10896 1726882169.04120: in VariableManager get_vars() 10896 1726882169.04155: Calling all_inventory to load vars for managed_node2 10896 1726882169.04158: Calling groups_inventory to load vars for managed_node2 10896 1726882169.04160: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882169.04168: Calling all_plugins_play to load vars for managed_node2 10896 1726882169.04170: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882169.04173: Calling groups_plugins_play to load vars for managed_node2 10896 1726882169.04354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882169.04573: done with get_vars() 10896 1726882169.04584: done getting variables 10896 1726882169.04650: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882169.04776: variable 'interface' from source: task vars 10896 1726882169.04780: variable 'dhcp_interface2' from source: play vars 10896 1726882169.04839: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:29:29 -0400 (0:00:00.415) 0:00:10.615 ****** 10896 1726882169.04881: entering _queue_task() for managed_node2/assert 10896 1726882169.05151: worker is 1 (out of 1 available) 10896 1726882169.05164: exiting _queue_task() for managed_node2/assert 10896 1726882169.05289: done queuing things up, now waiting for results queue to drain 10896 1726882169.05291: waiting for pending results... 
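The exchange logged above is Ansible's standard non-pipelined module execution path: an ssh call over the cached ControlMaster socket, creation of a remote temp directory under ~/.ansible/tmp, an SFTP upload of AnsiballZ_stat.py, a chmod, execution with the remote /usr/bin/python3.12, and a final rm -f -r cleanup. A minimal sketch of a task that would produce the logged stat arguments follows; it is a reconstruction from the log, not the contents of the referenced playbook file, and the register name interface_stat is inferred from the variable references that appear later in the log.

- name: Get stat for interface test2
  ansible.builtin.stat:
    path: /sys/class/net/test2   # logged module argument; presumably templated from '{{ interface }}' in the real test
    get_attributes: false        # logged module argument
    get_checksum: false          # logged module argument
    get_mime: false              # logged module argument
  register: interface_stat       # assumed name, based on the 'interface_stat' variable referenced later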
10896 1726882169.05518: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test2' 10896 1726882169.05587: in run() - task 12673a56-9f93-8b02-b216-00000000001c 10896 1726882169.05634: variable 'ansible_search_path' from source: unknown 10896 1726882169.05638: variable 'ansible_search_path' from source: unknown 10896 1726882169.05670: calling self._execute() 10896 1726882169.05800: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882169.05804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882169.05807: variable 'omit' from source: magic vars 10896 1726882169.06261: variable 'ansible_distribution_major_version' from source: facts 10896 1726882169.06285: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882169.06304: variable 'omit' from source: magic vars 10896 1726882169.06379: variable 'omit' from source: magic vars 10896 1726882169.06490: variable 'interface' from source: task vars 10896 1726882169.06499: variable 'dhcp_interface2' from source: play vars 10896 1726882169.06539: variable 'dhcp_interface2' from source: play vars 10896 1726882169.06564: variable 'omit' from source: magic vars 10896 1726882169.06617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882169.06655: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882169.06680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882169.06714: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882169.06731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882169.06765: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882169.06775: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882169.06784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882169.06896: Set connection var ansible_connection to ssh 10896 1726882169.06928: Set connection var ansible_timeout to 10 10896 1726882169.06931: Set connection var ansible_shell_type to sh 10896 1726882169.06934: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882169.07000: Set connection var ansible_shell_executable to /bin/sh 10896 1726882169.07003: Set connection var ansible_pipelining to False 10896 1726882169.07006: variable 'ansible_shell_executable' from source: unknown 10896 1726882169.07008: variable 'ansible_connection' from source: unknown 10896 1726882169.07010: variable 'ansible_module_compression' from source: unknown 10896 1726882169.07013: variable 'ansible_shell_type' from source: unknown 10896 1726882169.07015: variable 'ansible_shell_executable' from source: unknown 10896 1726882169.07017: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882169.07019: variable 'ansible_pipelining' from source: unknown 10896 1726882169.07021: variable 'ansible_timeout' from source: unknown 10896 1726882169.07023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882169.07166: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882169.07180: variable 'omit' from source: magic vars 10896 1726882169.07187: starting attempt loop 10896 1726882169.07196: running the handler 10896 1726882169.07324: variable 'interface_stat' from source: set_fact 10896 1726882169.07345: Evaluated conditional (interface_stat.stat.exists): True 10896 1726882169.07353: handler run complete 10896 1726882169.07472: attempt loop complete, returning result 10896 1726882169.07474: _execute() done 10896 1726882169.07477: dumping result to json 10896 1726882169.07479: done dumping result, returning 10896 1726882169.07481: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test2' [12673a56-9f93-8b02-b216-00000000001c] 10896 1726882169.07482: sending task result for task 12673a56-9f93-8b02-b216-00000000001c 10896 1726882169.07548: done sending task result for task 12673a56-9f93-8b02-b216-00000000001c 10896 1726882169.07551: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 10896 1726882169.07598: no more pending results, returning what we have 10896 1726882169.07600: results queue empty 10896 1726882169.07601: checking for any_errors_fatal 10896 1726882169.07608: done checking for any_errors_fatal 10896 1726882169.07609: checking for max_fail_percentage 10896 1726882169.07611: done checking for max_fail_percentage 10896 1726882169.07612: checking to see if all hosts have failed and the running result is not ok 10896 1726882169.07613: done checking to see if all hosts have failed 10896 1726882169.07614: getting the remaining hosts for this loop 10896 1726882169.07615: done getting the remaining hosts for this loop 10896 1726882169.07618: getting the next task for host managed_node2 10896 1726882169.07625: done getting next task for host managed_node2 10896 1726882169.07627: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 10896 1726882169.07629: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882169.07633: getting variables 10896 1726882169.07634: in VariableManager get_vars() 10896 1726882169.07674: Calling all_inventory to load vars for managed_node2 10896 1726882169.07676: Calling groups_inventory to load vars for managed_node2 10896 1726882169.07679: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882169.07688: Calling all_plugins_play to load vars for managed_node2 10896 1726882169.07691: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882169.07697: Calling groups_plugins_play to load vars for managed_node2 10896 1726882169.08116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882169.08317: done with get_vars() 10896 1726882169.08327: done getting variables 10896 1726882169.08386: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:28 Friday 20 September 2024 21:29:29 -0400 (0:00:00.035) 0:00:10.651 ****** 10896 1726882169.08417: entering _queue_task() for managed_node2/command 10896 1726882169.08657: worker is 1 (out of 1 available) 10896 1726882169.08782: exiting _queue_task() for managed_node2/command 10896 1726882169.08798: done queuing things up, now waiting for results queue to drain 10896 1726882169.08800: waiting for pending results... 10896 1726882169.09013: running TaskExecutor() for managed_node2/TASK: Backup the /etc/resolv.conf for initscript 10896 1726882169.09101: in run() - task 12673a56-9f93-8b02-b216-00000000001d 10896 1726882169.09110: variable 'ansible_search_path' from source: unknown 10896 1726882169.09113: calling self._execute() 10896 1726882169.09187: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882169.09202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882169.09220: variable 'omit' from source: magic vars 10896 1726882169.09580: variable 'ansible_distribution_major_version' from source: facts 10896 1726882169.09600: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882169.09725: variable 'network_provider' from source: set_fact 10896 1726882169.09735: Evaluated conditional (network_provider == "initscripts"): False 10896 1726882169.09742: when evaluation is False, skipping this task 10896 1726882169.09748: _execute() done 10896 1726882169.09870: dumping result to json 10896 1726882169.09874: done dumping result, returning 10896 1726882169.09877: done running TaskExecutor() for managed_node2/TASK: Backup the /etc/resolv.conf for initscript [12673a56-9f93-8b02-b216-00000000001d] 10896 1726882169.09880: sending task result for task 12673a56-9f93-8b02-b216-00000000001d 10896 1726882169.09946: done sending task result for task 12673a56-9f93-8b02-b216-00000000001d 10896 1726882169.09949: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 10896 1726882169.10001: no more pending results, returning what we have 10896 1726882169.10004: results 
queue empty 10896 1726882169.10005: checking for any_errors_fatal 10896 1726882169.10010: done checking for any_errors_fatal 10896 1726882169.10011: checking for max_fail_percentage 10896 1726882169.10012: done checking for max_fail_percentage 10896 1726882169.10013: checking to see if all hosts have failed and the running result is not ok 10896 1726882169.10014: done checking to see if all hosts have failed 10896 1726882169.10014: getting the remaining hosts for this loop 10896 1726882169.10015: done getting the remaining hosts for this loop 10896 1726882169.10018: getting the next task for host managed_node2 10896 1726882169.10025: done getting next task for host managed_node2 10896 1726882169.10027: ^ task is: TASK: TEST Add Bond with 2 ports using deprecated 'master' argument 10896 1726882169.10029: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882169.10032: getting variables 10896 1726882169.10034: in VariableManager get_vars() 10896 1726882169.10070: Calling all_inventory to load vars for managed_node2 10896 1726882169.10073: Calling groups_inventory to load vars for managed_node2 10896 1726882169.10075: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882169.10204: Calling all_plugins_play to load vars for managed_node2 10896 1726882169.10208: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882169.10212: Calling groups_plugins_play to load vars for managed_node2 10896 1726882169.10429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882169.10638: done with get_vars() 10896 1726882169.10648: done getting variables 10896 1726882169.10708: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST Add Bond with 2 ports using deprecated 'master' argument] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:33 Friday 20 September 2024 21:29:29 -0400 (0:00:00.023) 0:00:10.674 ****** 10896 1726882169.10734: entering _queue_task() for managed_node2/debug 10896 1726882169.11081: worker is 1 (out of 1 available) 10896 1726882169.11091: exiting _queue_task() for managed_node2/debug 10896 1726882169.11106: done queuing things up, now waiting for results queue to drain 10896 1726882169.11107: waiting for pending results... 
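The two tasks evaluated above amount to a simple assert on the registered stat result and a conditional backup that is skipped because network_provider is not "initscripts". A hedged reconstruction is below; only the task names, the assert condition, and the false_condition are taken from the log, while the command body is a placeholder since the skipped task's arguments are never shown.

- name: "Assert that the interface is present - '{{ interface }}'"
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists          # conditional quoted in the log

- name: Backup the /etc/resolv.conf for initscript
  ansible.builtin.command:
    cmd: cp /etc/resolv.conf /tmp/resolv.conf.bak   # hypothetical command; the real task body is not visible because the task was skipped
  when: network_provider == "initscripts"           # false_condition reported in the log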
10896 1726882169.11309: running TaskExecutor() for managed_node2/TASK: TEST Add Bond with 2 ports using deprecated 'master' argument 10896 1726882169.11404: in run() - task 12673a56-9f93-8b02-b216-00000000001e 10896 1726882169.11408: variable 'ansible_search_path' from source: unknown 10896 1726882169.11431: calling self._execute() 10896 1726882169.11532: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882169.11599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882169.11603: variable 'omit' from source: magic vars 10896 1726882169.12025: variable 'ansible_distribution_major_version' from source: facts 10896 1726882169.12042: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882169.12059: variable 'omit' from source: magic vars 10896 1726882169.12082: variable 'omit' from source: magic vars 10896 1726882169.12131: variable 'omit' from source: magic vars 10896 1726882169.12178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882169.12226: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882169.12275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882169.12278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882169.12298: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882169.12383: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882169.12386: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882169.12388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882169.12446: Set connection var ansible_connection to ssh 10896 1726882169.12456: Set connection var ansible_timeout to 10 10896 1726882169.12461: Set connection var ansible_shell_type to sh 10896 1726882169.12472: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882169.12479: Set connection var ansible_shell_executable to /bin/sh 10896 1726882169.12491: Set connection var ansible_pipelining to False 10896 1726882169.12521: variable 'ansible_shell_executable' from source: unknown 10896 1726882169.12530: variable 'ansible_connection' from source: unknown 10896 1726882169.12536: variable 'ansible_module_compression' from source: unknown 10896 1726882169.12542: variable 'ansible_shell_type' from source: unknown 10896 1726882169.12602: variable 'ansible_shell_executable' from source: unknown 10896 1726882169.12605: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882169.12608: variable 'ansible_pipelining' from source: unknown 10896 1726882169.12609: variable 'ansible_timeout' from source: unknown 10896 1726882169.12612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882169.12702: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882169.12721: variable 'omit' from source: magic vars 10896 1726882169.12729: starting attempt loop 10896 
1726882169.12735: running the handler 10896 1726882169.12781: handler run complete 10896 1726882169.12805: attempt loop complete, returning result 10896 1726882169.12811: _execute() done 10896 1726882169.12823: dumping result to json 10896 1726882169.12828: done dumping result, returning 10896 1726882169.12837: done running TaskExecutor() for managed_node2/TASK: TEST Add Bond with 2 ports using deprecated 'master' argument [12673a56-9f93-8b02-b216-00000000001e] 10896 1726882169.12926: sending task result for task 12673a56-9f93-8b02-b216-00000000001e 10896 1726882169.12987: done sending task result for task 12673a56-9f93-8b02-b216-00000000001e 10896 1726882169.12991: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ################################################## 10896 1726882169.13112: no more pending results, returning what we have 10896 1726882169.13115: results queue empty 10896 1726882169.13116: checking for any_errors_fatal 10896 1726882169.13120: done checking for any_errors_fatal 10896 1726882169.13121: checking for max_fail_percentage 10896 1726882169.13123: done checking for max_fail_percentage 10896 1726882169.13124: checking to see if all hosts have failed and the running result is not ok 10896 1726882169.13125: done checking to see if all hosts have failed 10896 1726882169.13125: getting the remaining hosts for this loop 10896 1726882169.13127: done getting the remaining hosts for this loop 10896 1726882169.13130: getting the next task for host managed_node2 10896 1726882169.13143: done getting next task for host managed_node2 10896 1726882169.13148: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10896 1726882169.13151: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882169.13166: getting variables 10896 1726882169.13167: in VariableManager get_vars() 10896 1726882169.13213: Calling all_inventory to load vars for managed_node2 10896 1726882169.13216: Calling groups_inventory to load vars for managed_node2 10896 1726882169.13219: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882169.13230: Calling all_plugins_play to load vars for managed_node2 10896 1726882169.13233: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882169.13236: Calling groups_plugins_play to load vars for managed_node2 10896 1726882169.13564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882169.13818: done with get_vars() 10896 1726882169.13828: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:29:29 -0400 (0:00:00.031) 0:00:10.706 ****** 10896 1726882169.13928: entering _queue_task() for managed_node2/include_tasks 10896 1726882169.14301: worker is 1 (out of 1 available) 10896 1726882169.14312: exiting _queue_task() for managed_node2/include_tasks 10896 1726882169.14323: done queuing things up, now waiting for results queue to drain 10896 1726882169.14324: waiting for pending results... 10896 1726882169.14522: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10896 1726882169.14617: in run() - task 12673a56-9f93-8b02-b216-000000000026 10896 1726882169.14636: variable 'ansible_search_path' from source: unknown 10896 1726882169.14657: variable 'ansible_search_path' from source: unknown 10896 1726882169.14687: calling self._execute() 10896 1726882169.14799: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882169.14803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882169.14805: variable 'omit' from source: magic vars 10896 1726882169.15153: variable 'ansible_distribution_major_version' from source: facts 10896 1726882169.15175: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882169.15184: _execute() done 10896 1726882169.15201: dumping result to json 10896 1726882169.15203: done dumping result, returning 10896 1726882169.15271: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-8b02-b216-000000000026] 10896 1726882169.15274: sending task result for task 12673a56-9f93-8b02-b216-000000000026 10896 1726882169.15346: done sending task result for task 12673a56-9f93-8b02-b216-000000000026 10896 1726882169.15350: WORKER PROCESS EXITING 10896 1726882169.15508: no more pending results, returning what we have 10896 1726882169.15513: in VariableManager get_vars() 10896 1726882169.15554: Calling all_inventory to load vars for managed_node2 10896 1726882169.15557: Calling groups_inventory to load vars for managed_node2 10896 1726882169.15560: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882169.15569: Calling all_plugins_play to load vars for managed_node2 10896 1726882169.15572: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882169.15575: Calling groups_plugins_play to load vars for managed_node2 10896 1726882169.15828: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882169.16062: done with get_vars() 10896 1726882169.16070: variable 'ansible_search_path' from source: unknown 10896 1726882169.16072: variable 'ansible_search_path' from source: unknown 10896 1726882169.16114: we have included files to process 10896 1726882169.16115: generating all_blocks data 10896 1726882169.16117: done generating all_blocks data 10896 1726882169.16121: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10896 1726882169.16122: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10896 1726882169.16124: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10896 1726882169.16850: done processing included file 10896 1726882169.16852: iterating over new_blocks loaded from include file 10896 1726882169.16853: in VariableManager get_vars() 10896 1726882169.16877: done with get_vars() 10896 1726882169.16879: filtering new block on tags 10896 1726882169.16899: done filtering new block on tags 10896 1726882169.16907: in VariableManager get_vars() 10896 1726882169.16930: done with get_vars() 10896 1726882169.16932: filtering new block on tags 10896 1726882169.16952: done filtering new block on tags 10896 1726882169.16954: in VariableManager get_vars() 10896 1726882169.16977: done with get_vars() 10896 1726882169.16979: filtering new block on tags 10896 1726882169.17000: done filtering new block on tags 10896 1726882169.17003: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 10896 1726882169.17007: extending task lists for all hosts with included blocks 10896 1726882169.17848: done extending task lists 10896 1726882169.17850: done processing included files 10896 1726882169.17851: results queue empty 10896 1726882169.17851: checking for any_errors_fatal 10896 1726882169.17854: done checking for any_errors_fatal 10896 1726882169.17855: checking for max_fail_percentage 10896 1726882169.17856: done checking for max_fail_percentage 10896 1726882169.17856: checking to see if all hosts have failed and the running result is not ok 10896 1726882169.17857: done checking to see if all hosts have failed 10896 1726882169.17858: getting the remaining hosts for this loop 10896 1726882169.17859: done getting the remaining hosts for this loop 10896 1726882169.17861: getting the next task for host managed_node2 10896 1726882169.17864: done getting next task for host managed_node2 10896 1726882169.17866: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10896 1726882169.17869: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882169.17881: getting variables 10896 1726882169.17882: in VariableManager get_vars() 10896 1726882169.17899: Calling all_inventory to load vars for managed_node2 10896 1726882169.17902: Calling groups_inventory to load vars for managed_node2 10896 1726882169.17903: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882169.17908: Calling all_plugins_play to load vars for managed_node2 10896 1726882169.17910: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882169.17913: Calling groups_plugins_play to load vars for managed_node2 10896 1726882169.18071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882169.18255: done with get_vars() 10896 1726882169.18264: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:29:29 -0400 (0:00:00.044) 0:00:10.750 ****** 10896 1726882169.18340: entering _queue_task() for managed_node2/setup 10896 1726882169.18673: worker is 1 (out of 1 available) 10896 1726882169.18683: exiting _queue_task() for managed_node2/setup 10896 1726882169.18698: done queuing things up, now waiting for results queue to drain 10896 1726882169.18699: waiting for pending results... 10896 1726882169.18935: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10896 1726882169.19034: in run() - task 12673a56-9f93-8b02-b216-000000000189 10896 1726882169.19038: variable 'ansible_search_path' from source: unknown 10896 1726882169.19040: variable 'ansible_search_path' from source: unknown 10896 1726882169.19078: calling self._execute() 10896 1726882169.19161: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882169.19185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882169.19296: variable 'omit' from source: magic vars 10896 1726882169.19558: variable 'ansible_distribution_major_version' from source: facts 10896 1726882169.19572: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882169.19782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882169.21929: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882169.21998: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882169.22046: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882169.22085: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882169.22139: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882169.22222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
10896 1726882169.22264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882169.22349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882169.22352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882169.22372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882169.22434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882169.22470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882169.22500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882169.22540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882169.22556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882169.22709: variable '__network_required_facts' from source: role '' defaults 10896 1726882169.22781: variable 'ansible_facts' from source: unknown 10896 1726882169.22816: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 10896 1726882169.22825: when evaluation is False, skipping this task 10896 1726882169.22831: _execute() done 10896 1726882169.22837: dumping result to json 10896 1726882169.22843: done dumping result, returning 10896 1726882169.22852: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-8b02-b216-000000000189] 10896 1726882169.22860: sending task result for task 12673a56-9f93-8b02-b216-000000000189 10896 1726882169.23048: done sending task result for task 12673a56-9f93-8b02-b216-000000000189 10896 1726882169.23051: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10896 1726882169.23096: no more pending results, returning what we have 10896 1726882169.23100: results queue empty 10896 1726882169.23101: checking for any_errors_fatal 10896 1726882169.23102: done checking for any_errors_fatal 10896 1726882169.23103: checking for max_fail_percentage 10896 1726882169.23104: done checking for max_fail_percentage 10896 1726882169.23105: checking to see if all hosts have failed and the running 
result is not ok 10896 1726882169.23106: done checking to see if all hosts have failed 10896 1726882169.23106: getting the remaining hosts for this loop 10896 1726882169.23108: done getting the remaining hosts for this loop 10896 1726882169.23111: getting the next task for host managed_node2 10896 1726882169.23120: done getting next task for host managed_node2 10896 1726882169.23123: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 10896 1726882169.23127: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882169.23139: getting variables 10896 1726882169.23140: in VariableManager get_vars() 10896 1726882169.23337: Calling all_inventory to load vars for managed_node2 10896 1726882169.23340: Calling groups_inventory to load vars for managed_node2 10896 1726882169.23343: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882169.23351: Calling all_plugins_play to load vars for managed_node2 10896 1726882169.23354: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882169.23357: Calling groups_plugins_play to load vars for managed_node2 10896 1726882169.23539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882169.23777: done with get_vars() 10896 1726882169.23786: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:29:29 -0400 (0:00:00.055) 0:00:10.805 ****** 10896 1726882169.23887: entering _queue_task() for managed_node2/stat 10896 1726882169.24121: worker is 1 (out of 1 available) 10896 1726882169.24134: exiting _queue_task() for managed_node2/stat 10896 1726882169.24306: done queuing things up, now waiting for results queue to drain 10896 1726882169.24308: waiting for pending results... 
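The role only re-runs fact gathering when one of its required facts is missing; here the difference() filter finds nothing missing, so the setup task is skipped and its output is censored because no_log is set. A sketch of such a gated fact-gathering task, assuming a minimal gather_subset (the subset actually used by set_facts.yml is not visible in the log):

- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min            # assumed subset; only the skip and its condition appear in the log
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0   # conditional quoted in the log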
10896 1726882169.24439: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 10896 1726882169.24548: in run() - task 12673a56-9f93-8b02-b216-00000000018b 10896 1726882169.24566: variable 'ansible_search_path' from source: unknown 10896 1726882169.24599: variable 'ansible_search_path' from source: unknown 10896 1726882169.24613: calling self._execute() 10896 1726882169.24691: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882169.24708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882169.24753: variable 'omit' from source: magic vars 10896 1726882169.25069: variable 'ansible_distribution_major_version' from source: facts 10896 1726882169.25098: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882169.25253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882169.25630: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882169.25633: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882169.25636: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882169.25672: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882169.25766: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882169.25801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 1726882169.25835: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882169.25874: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882169.25970: variable '__network_is_ostree' from source: set_fact 10896 1726882169.25982: Evaluated conditional (not __network_is_ostree is defined): False 10896 1726882169.25989: when evaluation is False, skipping this task 10896 1726882169.26003: _execute() done 10896 1726882169.26011: dumping result to json 10896 1726882169.26019: done dumping result, returning 10896 1726882169.26032: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-8b02-b216-00000000018b] 10896 1726882169.26042: sending task result for task 12673a56-9f93-8b02-b216-00000000018b skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10896 1726882169.26220: no more pending results, returning what we have 10896 1726882169.26223: results queue empty 10896 1726882169.26224: checking for any_errors_fatal 10896 1726882169.26231: done checking for any_errors_fatal 10896 1726882169.26232: checking for max_fail_percentage 10896 1726882169.26233: done checking for max_fail_percentage 10896 1726882169.26234: checking to see if all hosts have 
failed and the running result is not ok 10896 1726882169.26235: done checking to see if all hosts have failed 10896 1726882169.26236: getting the remaining hosts for this loop 10896 1726882169.26238: done getting the remaining hosts for this loop 10896 1726882169.26241: getting the next task for host managed_node2 10896 1726882169.26248: done getting next task for host managed_node2 10896 1726882169.26251: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10896 1726882169.26255: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882169.26269: getting variables 10896 1726882169.26270: in VariableManager get_vars() 10896 1726882169.26314: Calling all_inventory to load vars for managed_node2 10896 1726882169.26317: Calling groups_inventory to load vars for managed_node2 10896 1726882169.26320: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882169.26330: Calling all_plugins_play to load vars for managed_node2 10896 1726882169.26333: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882169.26336: Calling groups_plugins_play to load vars for managed_node2 10896 1726882169.26674: done sending task result for task 12673a56-9f93-8b02-b216-00000000018b 10896 1726882169.26678: WORKER PROCESS EXITING 10896 1726882169.26707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882169.26921: done with get_vars() 10896 1726882169.26931: done getting variables 10896 1726882169.26981: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:29:29 -0400 (0:00:00.031) 0:00:10.837 ****** 10896 1726882169.27020: entering _queue_task() for managed_node2/set_fact 10896 1726882169.27313: worker is 1 (out of 1 available) 10896 1726882169.27324: exiting _queue_task() for managed_node2/set_fact 10896 1726882169.27338: done queuing things up, now waiting for results queue to drain 10896 1726882169.27340: waiting for pending results... 
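
The two tasks traced next, "Check if system is ostree" (a stat action) and "Set flag to indicate system is ostree" (a set_fact action), are both skipped because "not __network_is_ostree is defined" evaluates False. A minimal sketch of such a pair follows; the module types and the shared conditional come from the log, while the marker path and the register name are assumptions for illustration only.

# Hedged sketch of a stat + set_fact pair guarded by the conditional seen above.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted          # assumption: typical ostree marker file
  register: __ostree_booted_stat      # hypothetical register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists | default(false) }}"
  when: not __network_is_ostree is defined
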
10896 1726882169.27508: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10896 1726882169.27639: in run() - task 12673a56-9f93-8b02-b216-00000000018c 10896 1726882169.27662: variable 'ansible_search_path' from source: unknown 10896 1726882169.27669: variable 'ansible_search_path' from source: unknown 10896 1726882169.27708: calling self._execute() 10896 1726882169.27782: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882169.27797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882169.27810: variable 'omit' from source: magic vars 10896 1726882169.28142: variable 'ansible_distribution_major_version' from source: facts 10896 1726882169.28158: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882169.28333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882169.28678: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882169.28730: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882169.28771: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882169.28855: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882169.28905: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882169.28940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 1726882169.28975: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882169.29011: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882169.29107: variable '__network_is_ostree' from source: set_fact 10896 1726882169.29119: Evaluated conditional (not __network_is_ostree is defined): False 10896 1726882169.29143: when evaluation is False, skipping this task 10896 1726882169.29146: _execute() done 10896 1726882169.29148: dumping result to json 10896 1726882169.29150: done dumping result, returning 10896 1726882169.29182: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-8b02-b216-00000000018c] 10896 1726882169.29185: sending task result for task 12673a56-9f93-8b02-b216-00000000018c skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10896 1726882169.29434: no more pending results, returning what we have 10896 1726882169.29436: results queue empty 10896 1726882169.29437: checking for any_errors_fatal 10896 1726882169.29440: done checking for any_errors_fatal 10896 1726882169.29441: checking for max_fail_percentage 10896 1726882169.29442: done checking for max_fail_percentage 10896 1726882169.29443: checking to see 
if all hosts have failed and the running result is not ok 10896 1726882169.29444: done checking to see if all hosts have failed 10896 1726882169.29444: getting the remaining hosts for this loop 10896 1726882169.29446: done getting the remaining hosts for this loop 10896 1726882169.29449: getting the next task for host managed_node2 10896 1726882169.29456: done getting next task for host managed_node2 10896 1726882169.29459: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 10896 1726882169.29461: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882169.29472: getting variables 10896 1726882169.29473: in VariableManager get_vars() 10896 1726882169.29513: Calling all_inventory to load vars for managed_node2 10896 1726882169.29516: Calling groups_inventory to load vars for managed_node2 10896 1726882169.29518: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882169.29524: done sending task result for task 12673a56-9f93-8b02-b216-00000000018c 10896 1726882169.29527: WORKER PROCESS EXITING 10896 1726882169.29536: Calling all_plugins_play to load vars for managed_node2 10896 1726882169.29538: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882169.29541: Calling groups_plugins_play to load vars for managed_node2 10896 1726882169.29813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882169.30031: done with get_vars() 10896 1726882169.30039: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:29:29 -0400 (0:00:00.031) 0:00:10.868 ****** 10896 1726882169.30130: entering _queue_task() for managed_node2/service_facts 10896 1726882169.30132: Creating lock for service_facts 10896 1726882169.30414: worker is 1 (out of 1 available) 10896 1726882169.30427: exiting _queue_task() for managed_node2/service_facts 10896 1726882169.30439: done queuing things up, now waiting for results queue to drain 10896 1726882169.30440: waiting for pending results... 
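
The next task runs the service_facts module and, further down, returns the ansible_facts.services mapping (service name to state/status/source) shown in the module's JSON output. A minimal sketch of gathering and consuming that mapping follows; the debug task and the NetworkManager check are illustrations only, not part of the traced role, though NetworkManager.service does appear in the returned facts.

# Hedged sketch: gather service facts, then use the resulting mapping.
- name: Check which services are running
  ansible.builtin.service_facts:

- name: Report whether NetworkManager is running
  ansible.builtin.debug:
    msg: "NetworkManager running: {{ ansible_facts.services['NetworkManager.service'].state == 'running' }}"
  when: "'NetworkManager.service' in ansible_facts.services"
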
10896 1726882169.30658: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 10896 1726882169.30791: in run() - task 12673a56-9f93-8b02-b216-00000000018e 10896 1726882169.30824: variable 'ansible_search_path' from source: unknown 10896 1726882169.30834: variable 'ansible_search_path' from source: unknown 10896 1726882169.30872: calling self._execute() 10896 1726882169.30960: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882169.30972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882169.30987: variable 'omit' from source: magic vars 10896 1726882169.31349: variable 'ansible_distribution_major_version' from source: facts 10896 1726882169.31373: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882169.31384: variable 'omit' from source: magic vars 10896 1726882169.31457: variable 'omit' from source: magic vars 10896 1726882169.31507: variable 'omit' from source: magic vars 10896 1726882169.31549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882169.31699: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882169.31703: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882169.31705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882169.31707: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882169.31710: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882169.31712: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882169.31714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882169.31808: Set connection var ansible_connection to ssh 10896 1726882169.31821: Set connection var ansible_timeout to 10 10896 1726882169.31828: Set connection var ansible_shell_type to sh 10896 1726882169.31841: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882169.31853: Set connection var ansible_shell_executable to /bin/sh 10896 1726882169.31864: Set connection var ansible_pipelining to False 10896 1726882169.31891: variable 'ansible_shell_executable' from source: unknown 10896 1726882169.31910: variable 'ansible_connection' from source: unknown 10896 1726882169.31922: variable 'ansible_module_compression' from source: unknown 10896 1726882169.32022: variable 'ansible_shell_type' from source: unknown 10896 1726882169.32025: variable 'ansible_shell_executable' from source: unknown 10896 1726882169.32027: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882169.32030: variable 'ansible_pipelining' from source: unknown 10896 1726882169.32032: variable 'ansible_timeout' from source: unknown 10896 1726882169.32034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882169.32167: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10896 1726882169.32185: variable 'omit' from source: magic vars 10896 
1726882169.32199: starting attempt loop 10896 1726882169.32207: running the handler 10896 1726882169.32225: _low_level_execute_command(): starting 10896 1726882169.32244: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882169.33015: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882169.33065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882169.33078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882169.33108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882169.33207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882169.34829: stdout chunk (state=3): >>>/root <<< 10896 1726882169.34987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882169.34991: stdout chunk (state=3): >>><<< 10896 1726882169.34998: stderr chunk (state=3): >>><<< 10896 1726882169.35016: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882169.35111: _low_level_execute_command(): starting 10896 1726882169.35115: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882169.3502276-11485-89568301835793 `" && echo ansible-tmp-1726882169.3502276-11485-89568301835793="` echo 
/root/.ansible/tmp/ansible-tmp-1726882169.3502276-11485-89568301835793 `" ) && sleep 0' 10896 1726882169.35685: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882169.35705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882169.35808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882169.37705: stdout chunk (state=3): >>>ansible-tmp-1726882169.3502276-11485-89568301835793=/root/.ansible/tmp/ansible-tmp-1726882169.3502276-11485-89568301835793 <<< 10896 1726882169.37806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882169.37824: stderr chunk (state=3): >>><<< 10896 1726882169.37838: stdout chunk (state=3): >>><<< 10896 1726882169.37859: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882169.3502276-11485-89568301835793=/root/.ansible/tmp/ansible-tmp-1726882169.3502276-11485-89568301835793 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882169.38000: variable 'ansible_module_compression' from source: unknown 10896 1726882169.38005: ANSIBALLZ: Using lock for service_facts 10896 1726882169.38008: ANSIBALLZ: Acquiring lock 10896 1726882169.38011: ANSIBALLZ: Lock acquired: 139646157888240 10896 1726882169.38014: ANSIBALLZ: Creating module 10896 1726882169.52041: ANSIBALLZ: Writing module into payload 10896 1726882169.52702: ANSIBALLZ: Writing 
module 10896 1726882169.52706: ANSIBALLZ: Renaming module 10896 1726882169.52709: ANSIBALLZ: Done creating module 10896 1726882169.52711: variable 'ansible_facts' from source: unknown 10896 1726882169.52714: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882169.3502276-11485-89568301835793/AnsiballZ_service_facts.py 10896 1726882169.53247: Sending initial data 10896 1726882169.53251: Sent initial data (161 bytes) 10896 1726882169.54710: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882169.54724: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882169.54747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882169.54839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882169.56417: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882169.56479: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10896 1726882169.56540: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmp49egmw3q /root/.ansible/tmp/ansible-tmp-1726882169.3502276-11485-89568301835793/AnsiballZ_service_facts.py <<< 10896 1726882169.56604: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882169.3502276-11485-89568301835793/AnsiballZ_service_facts.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmp49egmw3q" to remote "/root/.ansible/tmp/ansible-tmp-1726882169.3502276-11485-89568301835793/AnsiballZ_service_facts.py" <<< 10896 1726882169.56618: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882169.3502276-11485-89568301835793/AnsiballZ_service_facts.py" <<< 10896 1726882169.57930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882169.58046: stderr chunk (state=3): >>><<< 10896 1726882169.58049: stdout chunk (state=3): >>><<< 10896 1726882169.58059: done transferring module to remote 10896 1726882169.58074: _low_level_execute_command(): starting 10896 1726882169.58083: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882169.3502276-11485-89568301835793/ /root/.ansible/tmp/ansible-tmp-1726882169.3502276-11485-89568301835793/AnsiballZ_service_facts.py && sleep 0' 10896 1726882169.59347: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882169.59350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882169.59353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882169.59355: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882169.59357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 10896 1726882169.59359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882169.59612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882169.59627: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882169.61342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882169.61596: stderr chunk (state=3): >>><<< 10896 1726882169.61599: stdout chunk (state=3): >>><<< 10896 1726882169.61602: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882169.61604: _low_level_execute_command(): starting 10896 1726882169.61607: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882169.3502276-11485-89568301835793/AnsiballZ_service_facts.py && sleep 0' 10896 1726882169.62922: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882169.62926: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882169.62929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882169.63038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882169.63060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882169.63608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882171.15561: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 10896 1726882171.15622: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": 
{"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": 
"dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": 
{"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": <<< 10896 1726882171.15643: stdout chunk (state=3): >>>"static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 10896 1726882171.17118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882171.17121: stdout chunk (state=3): >>><<< 10896 1726882171.17130: stderr chunk (state=3): >>><<< 10896 1726882171.17151: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": 
"systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": 
{"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 10896 1726882171.18935: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882169.3502276-11485-89568301835793/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882171.18945: _low_level_execute_command(): starting 10896 1726882171.18950: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882169.3502276-11485-89568301835793/ > /dev/null 2>&1 && sleep 0' 10896 1726882171.19596: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882171.19609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882171.19621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882171.19636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882171.19649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882171.19656: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882171.19666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882171.19686: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882171.19695: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 10896 1726882171.19706: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10896 1726882171.19844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882171.19848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882171.19850: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882171.19918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882171.21686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882171.21734: stderr chunk (state=3): >>><<< 10896 1726882171.21756: stdout chunk (state=3): >>><<< 10896 1726882171.21901: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882171.21904: handler run complete 10896 1726882171.21976: variable 'ansible_facts' from source: unknown 10896 1726882171.22145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882171.22635: variable 'ansible_facts' from source: unknown 10896 1726882171.22806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882171.23012: attempt loop complete, returning result 10896 1726882171.23022: _execute() done 10896 1726882171.23029: dumping result to json 10896 1726882171.23086: done dumping result, returning 10896 1726882171.23108: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-8b02-b216-00000000018e] 10896 1726882171.23118: sending task result for task 12673a56-9f93-8b02-b216-00000000018e ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10896 1726882171.24143: no more pending results, returning what we have 10896 1726882171.24146: results queue empty 10896 1726882171.24147: checking for any_errors_fatal 10896 1726882171.24151: done checking for any_errors_fatal 10896 1726882171.24152: checking for max_fail_percentage 10896 1726882171.24154: done checking for max_fail_percentage 10896 1726882171.24154: checking to see if all hosts have failed and the running result is not ok 10896 1726882171.24155: done checking to see if all hosts have failed 10896 1726882171.24156: getting the remaining hosts for this loop 10896 1726882171.24157: done getting the remaining hosts for this loop 10896 1726882171.24160: getting the next task for host managed_node2 10896 1726882171.24165: done getting next task for host managed_node2 10896 1726882171.24168: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 10896 1726882171.24172: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
10896 1726882171.24180: getting variables
10896 1726882171.24182: in VariableManager get_vars()
10896 1726882171.24215: Calling all_inventory to load vars for managed_node2
10896 1726882171.24218: Calling groups_inventory to load vars for managed_node2
10896 1726882171.24220: Calling all_plugins_inventory to load vars for managed_node2
10896 1726882171.24229: Calling all_plugins_play to load vars for managed_node2
10896 1726882171.24231: Calling groups_plugins_inventory to load vars for managed_node2
10896 1726882171.24234: Calling groups_plugins_play to load vars for managed_node2
10896 1726882171.24809: done sending task result for task 12673a56-9f93-8b02-b216-00000000018e
10896 1726882171.24812: WORKER PROCESS EXITING
10896 1726882171.24874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10896 1726882171.25360: done with get_vars()
10896 1726882171.25376: done getting variables

TASK [fedora.linux_system_roles.network : Check which packages are installed] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Friday 20 September 2024 21:29:31 -0400 (0:00:01.953) 0:00:12.821 ******
10896 1726882171.25471: entering _queue_task() for managed_node2/package_facts
10896 1726882171.25473: Creating lock for package_facts
10896 1726882171.25857: worker is 1 (out of 1 available)
10896 1726882171.25868: exiting _queue_task() for managed_node2/package_facts
10896 1726882171.25879: done queuing things up, now waiting for results queue to drain
10896 1726882171.25880: waiting for pending results...
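The service_facts payload echoed above is a flat mapping keyed by unit name, each entry carrying "name", "state", "status" and "source" (here always "systemd"); the role gathers it with no_log enabled, which is why the task result is shown as censored, before moving on to the package_facts task below. The following is a minimal sketch of how that structure can be inspected offline, assuming the JSON object from the stdout dump has been saved verbatim to a local file; service_facts.json is a hypothetical filename and the summary logic is illustrative only, not part of the role or of this run.

    #!/usr/bin/env python3
    # Minimal sketch (not part of the run above): summarize a saved
    # service_facts payload like the one printed in the stdout dump.
    import json
    from collections import Counter

    # Hypothetical local copy of the module's stdout JSON.
    with open("service_facts.json") as fh:
        payload = json.load(fh)

    # Each entry has the shape seen above:
    #   {"name": ..., "state": ..., "status": ..., "source": "systemd"}
    services = payload["ansible_facts"]["services"]

    # Count services per state (running, stopped, inactive, unknown, ...).
    state_counts = Counter(svc["state"] for svc in services.values())
    print("services by state:", dict(state_counts))

    # Illustration only: look up one entry the network role cares about.
    nm = services.get("NetworkManager.service", {})
    print("NetworkManager.service:", nm.get("state"), "/", nm.get("status"))

Run against the dump above, the lookup would report NetworkManager.service as running/enabled, matching the entry in the payload.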
10896 1726882171.26057: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 10896 1726882171.26202: in run() - task 12673a56-9f93-8b02-b216-00000000018f 10896 1726882171.26226: variable 'ansible_search_path' from source: unknown 10896 1726882171.26234: variable 'ansible_search_path' from source: unknown 10896 1726882171.26277: calling self._execute() 10896 1726882171.26366: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882171.26379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882171.26392: variable 'omit' from source: magic vars 10896 1726882171.26769: variable 'ansible_distribution_major_version' from source: facts 10896 1726882171.26785: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882171.26800: variable 'omit' from source: magic vars 10896 1726882171.26878: variable 'omit' from source: magic vars 10896 1726882171.26924: variable 'omit' from source: magic vars 10896 1726882171.26965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882171.27012: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882171.27039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882171.27060: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882171.27076: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882171.27129: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882171.27133: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882171.27135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882171.27238: Set connection var ansible_connection to ssh 10896 1726882171.27256: Set connection var ansible_timeout to 10 10896 1726882171.27264: Set connection var ansible_shell_type to sh 10896 1726882171.27277: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882171.27346: Set connection var ansible_shell_executable to /bin/sh 10896 1726882171.27350: Set connection var ansible_pipelining to False 10896 1726882171.27354: variable 'ansible_shell_executable' from source: unknown 10896 1726882171.27356: variable 'ansible_connection' from source: unknown 10896 1726882171.27359: variable 'ansible_module_compression' from source: unknown 10896 1726882171.27361: variable 'ansible_shell_type' from source: unknown 10896 1726882171.27363: variable 'ansible_shell_executable' from source: unknown 10896 1726882171.27365: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882171.27368: variable 'ansible_pipelining' from source: unknown 10896 1726882171.27370: variable 'ansible_timeout' from source: unknown 10896 1726882171.27372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882171.27571: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10896 1726882171.27596: variable 'omit' from source: magic vars 10896 
1726882171.27674: starting attempt loop 10896 1726882171.27677: running the handler 10896 1726882171.27680: _low_level_execute_command(): starting 10896 1726882171.27682: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882171.28333: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882171.28347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882171.28357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882171.28373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882171.28386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882171.28395: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882171.28434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882171.28438: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882171.28441: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 10896 1726882171.28443: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10896 1726882171.28445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882171.28458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882171.28544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882171.28547: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882171.28550: stderr chunk (state=3): >>>debug2: match found <<< 10896 1726882171.28552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882171.28565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882171.28578: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882171.28604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882171.28695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882171.30276: stdout chunk (state=3): >>>/root <<< 10896 1726882171.30416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882171.30435: stderr chunk (state=3): >>><<< 10896 1726882171.30449: stdout chunk (state=3): >>><<< 10896 1726882171.30477: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882171.30574: _low_level_execute_command(): starting 10896 1726882171.30578: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882171.3048427-11591-88088339935019 `" && echo ansible-tmp-1726882171.3048427-11591-88088339935019="` echo /root/.ansible/tmp/ansible-tmp-1726882171.3048427-11591-88088339935019 `" ) && sleep 0' 10896 1726882171.31244: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882171.31247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882171.31249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882171.31252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882171.31254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882171.31256: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882171.31287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882171.31299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882171.31399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882171.33274: stdout chunk (state=3): >>>ansible-tmp-1726882171.3048427-11591-88088339935019=/root/.ansible/tmp/ansible-tmp-1726882171.3048427-11591-88088339935019 <<< 10896 1726882171.33361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882171.33391: stderr chunk (state=3): >>><<< 10896 1726882171.33398: stdout chunk (state=3): >>><<< 10896 1726882171.33410: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882171.3048427-11591-88088339935019=/root/.ansible/tmp/ansible-tmp-1726882171.3048427-11591-88088339935019 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882171.33448: variable 'ansible_module_compression' from source: unknown 10896 1726882171.33488: ANSIBALLZ: Using lock for package_facts 10896 1726882171.33492: ANSIBALLZ: Acquiring lock 10896 1726882171.33499: ANSIBALLZ: Lock acquired: 139646162069520 10896 1726882171.33501: ANSIBALLZ: Creating module 10896 1726882171.68302: ANSIBALLZ: Writing module into payload 10896 1726882171.68360: ANSIBALLZ: Writing module 10896 1726882171.68391: ANSIBALLZ: Renaming module 10896 1726882171.68409: ANSIBALLZ: Done creating module 10896 1726882171.68457: variable 'ansible_facts' from source: unknown 10896 1726882171.68672: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882171.3048427-11591-88088339935019/AnsiballZ_package_facts.py 10896 1726882171.68879: Sending initial data 10896 1726882171.68883: Sent initial data (161 bytes) 10896 1726882171.69554: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882171.69570: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882171.69585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882171.69648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882171.69714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882171.69730: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882171.69760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882171.69976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882171.71646: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: 
Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882171.71928: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882171.72000: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpnfn318nr /root/.ansible/tmp/ansible-tmp-1726882171.3048427-11591-88088339935019/AnsiballZ_package_facts.py <<< 10896 1726882171.72008: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882171.3048427-11591-88088339935019/AnsiballZ_package_facts.py" <<< 10896 1726882171.72082: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpnfn318nr" to remote "/root/.ansible/tmp/ansible-tmp-1726882171.3048427-11591-88088339935019/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882171.3048427-11591-88088339935019/AnsiballZ_package_facts.py" <<< 10896 1726882171.76205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882171.76209: stdout chunk (state=3): >>><<< 10896 1726882171.76212: stderr chunk (state=3): >>><<< 10896 1726882171.76215: done transferring module to remote 10896 1726882171.76217: _low_level_execute_command(): starting 10896 1726882171.76219: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882171.3048427-11591-88088339935019/ /root/.ansible/tmp/ansible-tmp-1726882171.3048427-11591-88088339935019/AnsiballZ_package_facts.py && sleep 0' 10896 1726882171.77261: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882171.77347: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882171.77439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882171.77476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882171.77578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882171.79713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882171.79731: stdout chunk (state=3): >>><<< 10896 1726882171.79734: stderr chunk (state=3): >>><<< 10896 
1726882171.79737: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882171.79739: _low_level_execute_command(): starting 10896 1726882171.79742: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882171.3048427-11591-88088339935019/AnsiballZ_package_facts.py && sleep 0' 10896 1726882171.80849: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882171.80853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882171.80856: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882171.80858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 10896 1726882171.80860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882171.81132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882172.25596: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", 
"version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 10896 1726882172.25723: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": 
"2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": 
"2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", 
"release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": 
"0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": 
[{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": 
[{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 10896 1726882172.27590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882172.27647: stderr chunk (state=3): >>>Shared connection to 10.31.14.69 closed. 
<<< 10896 1726882172.27690: stdout chunk (state=3): >>><<< 10896 1726882172.27704: stderr chunk (state=3): >>><<< 10896 1726882172.28104: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 10896 1726882172.31581: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882171.3048427-11591-88088339935019/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882172.31678: _low_level_execute_command(): starting 10896 1726882172.31759: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882171.3048427-11591-88088339935019/ > /dev/null 2>&1 && sleep 0' 10896 1726882172.33035: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882172.33050: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882172.33141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882172.33218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882172.33241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882172.33256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882172.33354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882172.35569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882172.35573: stdout chunk (state=3): >>><<< 10896 1726882172.35575: stderr chunk (state=3): >>><<< 10896 1726882172.35578: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882172.35580: handler run complete 10896 1726882172.36919: variable 'ansible_facts' from source: unknown 10896 1726882172.37364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882172.40912: variable 'ansible_facts' from source: unknown 10896 1726882172.41462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882172.42231: attempt loop complete, returning result 10896 1726882172.42241: _execute() done 10896 1726882172.42244: dumping result to json 10896 1726882172.42471: done dumping result, returning 10896 1726882172.42481: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-8b02-b216-00000000018f] 10896 1726882172.42486: sending task result for task 12673a56-9f93-8b02-b216-00000000018f 10896 1726882172.45667: done sending task result for task 12673a56-9f93-8b02-b216-00000000018f 10896 1726882172.45670: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10896 1726882172.45770: no more pending results, returning what we have 10896 1726882172.45773: results queue empty 10896 1726882172.45773: checking for any_errors_fatal 10896 1726882172.45778: done checking for any_errors_fatal 10896 1726882172.45779: checking for max_fail_percentage 10896 1726882172.45780: done checking for max_fail_percentage 10896 1726882172.45781: checking to see if all hosts have failed and the running result is not ok 10896 1726882172.45783: done checking to see if all hosts have failed 10896 1726882172.45784: getting the remaining hosts for this loop 10896 1726882172.45786: done getting the remaining hosts for this loop 10896 1726882172.45789: getting the next task for host managed_node2 10896 1726882172.45802: done getting next task for host managed_node2 10896 1726882172.45805: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 10896 1726882172.45808: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882172.45818: getting variables 10896 1726882172.45819: in VariableManager get_vars() 10896 1726882172.45851: Calling all_inventory to load vars for managed_node2 10896 1726882172.45854: Calling groups_inventory to load vars for managed_node2 10896 1726882172.45856: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882172.45865: Calling all_plugins_play to load vars for managed_node2 10896 1726882172.45868: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882172.45870: Calling groups_plugins_play to load vars for managed_node2 10896 1726882172.47088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882172.48745: done with get_vars() 10896 1726882172.48773: done getting variables 10896 1726882172.48848: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:29:32 -0400 (0:00:01.234) 0:00:14.055 ****** 10896 1726882172.48886: entering _queue_task() for managed_node2/debug 10896 1726882172.49422: worker is 1 (out of 1 available) 10896 1726882172.49433: exiting _queue_task() for managed_node2/debug 10896 1726882172.49442: done queuing things up, now waiting for results queue to drain 10896 1726882172.49444: waiting for pending results... 10896 1726882172.49574: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 10896 1726882172.49681: in run() - task 12673a56-9f93-8b02-b216-000000000027 10896 1726882172.49781: variable 'ansible_search_path' from source: unknown 10896 1726882172.49784: variable 'ansible_search_path' from source: unknown 10896 1726882172.49786: calling self._execute() 10896 1726882172.49850: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882172.49862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882172.49875: variable 'omit' from source: magic vars 10896 1726882172.50247: variable 'ansible_distribution_major_version' from source: facts 10896 1726882172.50264: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882172.50276: variable 'omit' from source: magic vars 10896 1726882172.50336: variable 'omit' from source: magic vars 10896 1726882172.50444: variable 'network_provider' from source: set_fact 10896 1726882172.50468: variable 'omit' from source: magic vars 10896 1726882172.50516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882172.50563: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882172.50588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882172.50653: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882172.50657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 
1726882172.50668: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882172.50676: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882172.50683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882172.50791: Set connection var ansible_connection to ssh 10896 1726882172.50810: Set connection var ansible_timeout to 10 10896 1726882172.50817: Set connection var ansible_shell_type to sh 10896 1726882172.50872: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882172.50875: Set connection var ansible_shell_executable to /bin/sh 10896 1726882172.50878: Set connection var ansible_pipelining to False 10896 1726882172.50880: variable 'ansible_shell_executable' from source: unknown 10896 1726882172.50887: variable 'ansible_connection' from source: unknown 10896 1726882172.50899: variable 'ansible_module_compression' from source: unknown 10896 1726882172.50907: variable 'ansible_shell_type' from source: unknown 10896 1726882172.50914: variable 'ansible_shell_executable' from source: unknown 10896 1726882172.50920: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882172.50927: variable 'ansible_pipelining' from source: unknown 10896 1726882172.50980: variable 'ansible_timeout' from source: unknown 10896 1726882172.50983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882172.51099: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882172.51116: variable 'omit' from source: magic vars 10896 1726882172.51127: starting attempt loop 10896 1726882172.51135: running the handler 10896 1726882172.51177: handler run complete 10896 1726882172.51203: attempt loop complete, returning result 10896 1726882172.51210: _execute() done 10896 1726882172.51299: dumping result to json 10896 1726882172.51307: done dumping result, returning 10896 1726882172.51309: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-8b02-b216-000000000027] 10896 1726882172.51311: sending task result for task 12673a56-9f93-8b02-b216-000000000027 10896 1726882172.51372: done sending task result for task 12673a56-9f93-8b02-b216-000000000027 10896 1726882172.51375: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 10896 1726882172.51459: no more pending results, returning what we have 10896 1726882172.51462: results queue empty 10896 1726882172.51463: checking for any_errors_fatal 10896 1726882172.51474: done checking for any_errors_fatal 10896 1726882172.51474: checking for max_fail_percentage 10896 1726882172.51476: done checking for max_fail_percentage 10896 1726882172.51477: checking to see if all hosts have failed and the running result is not ok 10896 1726882172.51478: done checking to see if all hosts have failed 10896 1726882172.51478: getting the remaining hosts for this loop 10896 1726882172.51480: done getting the remaining hosts for this loop 10896 1726882172.51483: getting the next task for host managed_node2 10896 1726882172.51489: done getting next task for host managed_node2 10896 1726882172.51497: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 10896 1726882172.51500: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882172.51510: getting variables 10896 1726882172.51512: in VariableManager get_vars() 10896 1726882172.51551: Calling all_inventory to load vars for managed_node2 10896 1726882172.51553: Calling groups_inventory to load vars for managed_node2 10896 1726882172.51555: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882172.51566: Calling all_plugins_play to load vars for managed_node2 10896 1726882172.51568: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882172.51571: Calling groups_plugins_play to load vars for managed_node2 10896 1726882172.53245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882172.54922: done with get_vars() 10896 1726882172.54953: done getting variables 10896 1726882172.55061: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:29:32 -0400 (0:00:00.062) 0:00:14.117 ****** 10896 1726882172.55106: entering _queue_task() for managed_node2/fail 10896 1726882172.55108: Creating lock for fail 10896 1726882172.55451: worker is 1 (out of 1 available) 10896 1726882172.55464: exiting _queue_task() for managed_node2/fail 10896 1726882172.55476: done queuing things up, now waiting for results queue to drain 10896 1726882172.55478: waiting for pending results... 
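The preceding entries cover two of the role's tasks: a package_facts run that produced the installed-package inventory dumped above (the task result is reported as censored because no_log: true was in effect), followed by a debug task that reported "Using network provider: nm". A minimal, hypothetical sketch of that pattern follows; the module arguments match the ones logged ({'manager': ['auto'], 'strategy': 'first'}), but the task bodies and the network_provider variable are illustrative, not the role's actual source.

- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto
    strategy: first
  no_log: true                      # why the result above appears as "censored"

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"

Each key of the resulting ansible_facts.packages dict maps to a list of entries with name, version, release, epoch, arch, and source fields, which is exactly the shape of the JSON dump earlier in this log.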
10896 1726882172.55846: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 10896 1726882172.55919: in run() - task 12673a56-9f93-8b02-b216-000000000028 10896 1726882172.55925: variable 'ansible_search_path' from source: unknown 10896 1726882172.55942: variable 'ansible_search_path' from source: unknown 10896 1726882172.55978: calling self._execute() 10896 1726882172.56100: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882172.56104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882172.56107: variable 'omit' from source: magic vars 10896 1726882172.56489: variable 'ansible_distribution_major_version' from source: facts 10896 1726882172.56509: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882172.56702: variable 'network_state' from source: role '' defaults 10896 1726882172.56705: Evaluated conditional (network_state != {}): False 10896 1726882172.56708: when evaluation is False, skipping this task 10896 1726882172.56711: _execute() done 10896 1726882172.56713: dumping result to json 10896 1726882172.56715: done dumping result, returning 10896 1726882172.56718: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-8b02-b216-000000000028] 10896 1726882172.56720: sending task result for task 12673a56-9f93-8b02-b216-000000000028 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10896 1726882172.56944: no more pending results, returning what we have 10896 1726882172.56948: results queue empty 10896 1726882172.56949: checking for any_errors_fatal 10896 1726882172.56955: done checking for any_errors_fatal 10896 1726882172.56956: checking for max_fail_percentage 10896 1726882172.56958: done checking for max_fail_percentage 10896 1726882172.56959: checking to see if all hosts have failed and the running result is not ok 10896 1726882172.56960: done checking to see if all hosts have failed 10896 1726882172.56960: getting the remaining hosts for this loop 10896 1726882172.56962: done getting the remaining hosts for this loop 10896 1726882172.56965: getting the next task for host managed_node2 10896 1726882172.56972: done getting next task for host managed_node2 10896 1726882172.56975: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10896 1726882172.56979: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882172.56999: getting variables 10896 1726882172.57000: in VariableManager get_vars() 10896 1726882172.57043: Calling all_inventory to load vars for managed_node2 10896 1726882172.57045: Calling groups_inventory to load vars for managed_node2 10896 1726882172.57048: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882172.57060: Calling all_plugins_play to load vars for managed_node2 10896 1726882172.57062: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882172.57065: Calling groups_plugins_play to load vars for managed_node2 10896 1726882172.57609: done sending task result for task 12673a56-9f93-8b02-b216-000000000028 10896 1726882172.57613: WORKER PROCESS EXITING 10896 1726882172.58574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882172.60363: done with get_vars() 10896 1726882172.60386: done getting variables 10896 1726882172.60456: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:29:32 -0400 (0:00:00.053) 0:00:14.171 ****** 10896 1726882172.60490: entering _queue_task() for managed_node2/fail 10896 1726882172.60823: worker is 1 (out of 1 available) 10896 1726882172.60835: exiting _queue_task() for managed_node2/fail 10896 1726882172.60848: done queuing things up, now waiting for results queue to drain 10896 1726882172.60849: waiting for pending results... 
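Both "Abort applying the network state configuration ..." tasks in this stretch of the log are guard tasks: each is skipped because the role-default network_state is an empty dict, so the logged condition network_state != {} evaluates to False. A minimal sketch of that guard pattern, with an illustrative message rather than the role's actual task definition:

- name: Abort if the network_state variable is in use
  ansible.builtin.fail:
    msg: "Applying network_state is not supported in this situation."
  when: network_state != {}         # False here, so the task is skipped

When the when clause evaluates to False, Ansible reports the task as skipping and records the failing expression as false_condition, which is what the skipping: [managed_node2] results above and below show.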
10896 1726882172.61136: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10896 1726882172.61267: in run() - task 12673a56-9f93-8b02-b216-000000000029 10896 1726882172.61286: variable 'ansible_search_path' from source: unknown 10896 1726882172.61299: variable 'ansible_search_path' from source: unknown 10896 1726882172.61344: calling self._execute() 10896 1726882172.61435: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882172.61446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882172.61460: variable 'omit' from source: magic vars 10896 1726882172.61840: variable 'ansible_distribution_major_version' from source: facts 10896 1726882172.61862: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882172.61996: variable 'network_state' from source: role '' defaults 10896 1726882172.62013: Evaluated conditional (network_state != {}): False 10896 1726882172.62021: when evaluation is False, skipping this task 10896 1726882172.62027: _execute() done 10896 1726882172.62034: dumping result to json 10896 1726882172.62041: done dumping result, returning 10896 1726882172.62051: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-8b02-b216-000000000029] 10896 1726882172.62060: sending task result for task 12673a56-9f93-8b02-b216-000000000029 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10896 1726882172.62231: no more pending results, returning what we have 10896 1726882172.62235: results queue empty 10896 1726882172.62236: checking for any_errors_fatal 10896 1726882172.62244: done checking for any_errors_fatal 10896 1726882172.62245: checking for max_fail_percentage 10896 1726882172.62247: done checking for max_fail_percentage 10896 1726882172.62248: checking to see if all hosts have failed and the running result is not ok 10896 1726882172.62249: done checking to see if all hosts have failed 10896 1726882172.62249: getting the remaining hosts for this loop 10896 1726882172.62251: done getting the remaining hosts for this loop 10896 1726882172.62255: getting the next task for host managed_node2 10896 1726882172.62261: done getting next task for host managed_node2 10896 1726882172.62265: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10896 1726882172.62268: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882172.62400: getting variables 10896 1726882172.62402: in VariableManager get_vars() 10896 1726882172.62447: Calling all_inventory to load vars for managed_node2 10896 1726882172.62450: Calling groups_inventory to load vars for managed_node2 10896 1726882172.62452: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882172.62499: done sending task result for task 12673a56-9f93-8b02-b216-000000000029 10896 1726882172.62504: WORKER PROCESS EXITING 10896 1726882172.62518: Calling all_plugins_play to load vars for managed_node2 10896 1726882172.62522: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882172.62525: Calling groups_plugins_play to load vars for managed_node2 10896 1726882172.63950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882172.65522: done with get_vars() 10896 1726882172.65555: done getting variables 10896 1726882172.65624: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:29:32 -0400 (0:00:00.051) 0:00:14.223 ****** 10896 1726882172.65665: entering _queue_task() for managed_node2/fail 10896 1726882172.66120: worker is 1 (out of 1 available) 10896 1726882172.66130: exiting _queue_task() for managed_node2/fail 10896 1726882172.66141: done queuing things up, now waiting for results queue to drain 10896 1726882172.66142: waiting for pending results... 
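The teaming guard queued above is evaluated a little further down in the log: it filters network_connections (and the interfaces in network_state) with a selectattr/match chain looking for entries of type team, and it is skipped because none are defined in this run. A small, self-contained illustration of that filter chain, using made-up connection data rather than anything from this playbook:

- name: Illustrate the team-detection filter chain (hypothetical data)
  vars:
    network_connections:
      - name: bond0
        type: bond
      - name: team0
        type: team
  ansible.builtin.debug:
    msg: >-
      {{ network_connections
         | selectattr('type', 'defined')
         | selectattr('type', 'match', '^team$')
         | list
         | length > 0 }}

With the sample data above the expression renders True; in the logged run the connections are bond controller/port profiles with no team-type entries, so the same expression evaluated to False and the task was skipped.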
10896 1726882172.66327: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10896 1726882172.66467: in run() - task 12673a56-9f93-8b02-b216-00000000002a 10896 1726882172.66586: variable 'ansible_search_path' from source: unknown 10896 1726882172.66589: variable 'ansible_search_path' from source: unknown 10896 1726882172.66596: calling self._execute() 10896 1726882172.66643: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882172.66655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882172.66671: variable 'omit' from source: magic vars 10896 1726882172.67092: variable 'ansible_distribution_major_version' from source: facts 10896 1726882172.67115: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882172.67319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882172.69592: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882172.69803: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882172.69807: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882172.69809: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882172.69853: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882172.69963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882172.70000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882172.70037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882172.70082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882172.70107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882172.70232: variable 'ansible_distribution_major_version' from source: facts 10896 1726882172.70267: Evaluated conditional (ansible_distribution_major_version | int > 9): True 10896 1726882172.70439: variable 'ansible_distribution' from source: facts 10896 1726882172.70452: variable '__network_rh_distros' from source: role '' defaults 10896 1726882172.70487: Evaluated conditional (ansible_distribution in __network_rh_distros): True 10896 1726882172.71102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882172.71108: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882172.71141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882172.71270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882172.71353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882172.71500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882172.71515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882172.71565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882172.71757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882172.71779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882172.71915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882172.71939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882172.71965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882172.72132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882172.72135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882172.72581: variable 'network_connections' from source: task vars 10896 1726882172.72601: variable 'controller_profile' from source: play vars 10896 1726882172.72675: variable 'controller_profile' from source: play vars 10896 1726882172.72688: variable 'controller_device' from source: play vars 10896 1726882172.72755: variable 'controller_device' from source: play vars 10896 1726882172.72768: variable 'port1_profile' from 
source: play vars 10896 1726882172.72836: variable 'port1_profile' from source: play vars 10896 1726882172.72847: variable 'dhcp_interface1' from source: play vars 10896 1726882172.72918: variable 'dhcp_interface1' from source: play vars 10896 1726882172.72929: variable 'controller_profile' from source: play vars 10896 1726882172.73077: variable 'controller_profile' from source: play vars 10896 1726882172.73081: variable 'port2_profile' from source: play vars 10896 1726882172.73083: variable 'port2_profile' from source: play vars 10896 1726882172.73085: variable 'dhcp_interface2' from source: play vars 10896 1726882172.73141: variable 'dhcp_interface2' from source: play vars 10896 1726882172.73188: variable 'controller_profile' from source: play vars 10896 1726882172.73250: variable 'controller_profile' from source: play vars 10896 1726882172.73262: variable 'network_state' from source: role '' defaults 10896 1726882172.73339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882172.73602: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882172.73605: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882172.73607: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882172.73631: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882172.73700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882172.73732: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 1726882172.73765: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882172.73799: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882172.73927: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 10896 1726882172.73937: when evaluation is False, skipping this task 10896 1726882172.73944: _execute() done 10896 1726882172.73951: dumping result to json 10896 1726882172.74073: done dumping result, returning 10896 1726882172.74076: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-8b02-b216-00000000002a] 10896 1726882172.74079: sending task result for task 12673a56-9f93-8b02-b216-00000000002a skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", 
\"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 10896 1726882172.74323: no more pending results, returning what we have 10896 1726882172.74326: results queue empty 10896 1726882172.74327: checking for any_errors_fatal 10896 1726882172.74332: done checking for any_errors_fatal 10896 1726882172.74333: checking for max_fail_percentage 10896 1726882172.74335: done checking for max_fail_percentage 10896 1726882172.74335: checking to see if all hosts have failed and the running result is not ok 10896 1726882172.74336: done checking to see if all hosts have failed 10896 1726882172.74337: getting the remaining hosts for this loop 10896 1726882172.74339: done getting the remaining hosts for this loop 10896 1726882172.74342: getting the next task for host managed_node2 10896 1726882172.74349: done getting next task for host managed_node2 10896 1726882172.74353: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 10896 1726882172.74405: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882172.74421: getting variables 10896 1726882172.74423: in VariableManager get_vars() 10896 1726882172.74698: Calling all_inventory to load vars for managed_node2 10896 1726882172.74702: Calling groups_inventory to load vars for managed_node2 10896 1726882172.74705: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882172.74808: Calling all_plugins_play to load vars for managed_node2 10896 1726882172.74812: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882172.74817: Calling groups_plugins_play to load vars for managed_node2 10896 1726882172.75501: done sending task result for task 12673a56-9f93-8b02-b216-00000000002a 10896 1726882172.75505: WORKER PROCESS EXITING 10896 1726882172.78939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882172.82459: done with get_vars() 10896 1726882172.82484: done getting variables 10896 1726882172.82583: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:29:32 -0400 (0:00:00.171) 0:00:14.395 ****** 10896 1726882172.82824: entering _queue_task() for managed_node2/dnf 10896 1726882172.83332: worker is 1 (out of 1 available) 10896 1726882172.83345: exiting _queue_task() for managed_node2/dnf 10896 1726882172.83356: done queuing things up, now 
waiting for results queue to drain 10896 1726882172.83358: waiting for pending results... 10896 1726882172.83817: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 10896 1726882172.84108: in run() - task 12673a56-9f93-8b02-b216-00000000002b 10896 1726882172.84206: variable 'ansible_search_path' from source: unknown 10896 1726882172.84215: variable 'ansible_search_path' from source: unknown 10896 1726882172.84252: calling self._execute() 10896 1726882172.84333: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882172.84344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882172.84354: variable 'omit' from source: magic vars 10896 1726882172.84823: variable 'ansible_distribution_major_version' from source: facts 10896 1726882172.84839: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882172.85055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882172.87437: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882172.87511: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882172.87554: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882172.87600: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882172.87632: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882172.87718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882172.87749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882172.87782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882172.87833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882172.87854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882172.87981: variable 'ansible_distribution' from source: facts 10896 1726882172.87997: variable 'ansible_distribution_major_version' from source: facts 10896 1726882172.88101: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 10896 1726882172.88134: variable '__network_wireless_connections_defined' from source: role '' defaults 10896 1726882172.88268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 10896 1726882172.88303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882172.88338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882172.88381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882172.88406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882172.88454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882172.88481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882172.88515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882172.88564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882172.88583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882172.88631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882172.88663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882172.88690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882172.88736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882172.88858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882172.88918: variable 'network_connections' from source: task vars 10896 1726882172.88936: variable 'controller_profile' from source: play vars 10896 1726882172.89006: variable 'controller_profile' from source: play vars 10896 1726882172.89020: variable 'controller_device' from source: play vars 10896 1726882172.89083: variable 
'controller_device' from source: play vars 10896 1726882172.89103: variable 'port1_profile' from source: play vars 10896 1726882172.89163: variable 'port1_profile' from source: play vars 10896 1726882172.89174: variable 'dhcp_interface1' from source: play vars 10896 1726882172.89244: variable 'dhcp_interface1' from source: play vars 10896 1726882172.89255: variable 'controller_profile' from source: play vars 10896 1726882172.89326: variable 'controller_profile' from source: play vars 10896 1726882172.89338: variable 'port2_profile' from source: play vars 10896 1726882172.89401: variable 'port2_profile' from source: play vars 10896 1726882172.89502: variable 'dhcp_interface2' from source: play vars 10896 1726882172.89504: variable 'dhcp_interface2' from source: play vars 10896 1726882172.89507: variable 'controller_profile' from source: play vars 10896 1726882172.89551: variable 'controller_profile' from source: play vars 10896 1726882172.89653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882172.89821: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882172.89864: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882172.89901: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882172.89935: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882172.89983: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882172.90056: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 1726882172.90060: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882172.90080: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882172.90146: variable '__network_team_connections_defined' from source: role '' defaults 10896 1726882172.90389: variable 'network_connections' from source: task vars 10896 1726882172.90404: variable 'controller_profile' from source: play vars 10896 1726882172.90464: variable 'controller_profile' from source: play vars 10896 1726882172.90474: variable 'controller_device' from source: play vars 10896 1726882172.90601: variable 'controller_device' from source: play vars 10896 1726882172.90604: variable 'port1_profile' from source: play vars 10896 1726882172.90606: variable 'port1_profile' from source: play vars 10896 1726882172.90616: variable 'dhcp_interface1' from source: play vars 10896 1726882172.90675: variable 'dhcp_interface1' from source: play vars 10896 1726882172.90687: variable 'controller_profile' from source: play vars 10896 1726882172.90755: variable 'controller_profile' from source: play vars 10896 1726882172.90767: variable 'port2_profile' from source: play vars 10896 1726882172.90838: variable 'port2_profile' from source: play vars 10896 1726882172.90851: variable 'dhcp_interface2' 
from source: play vars 10896 1726882172.90916: variable 'dhcp_interface2' from source: play vars 10896 1726882172.90934: variable 'controller_profile' from source: play vars 10896 1726882172.90999: variable 'controller_profile' from source: play vars 10896 1726882172.91041: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10896 1726882172.91098: when evaluation is False, skipping this task 10896 1726882172.91101: _execute() done 10896 1726882172.91104: dumping result to json 10896 1726882172.91106: done dumping result, returning 10896 1726882172.91108: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-8b02-b216-00000000002b] 10896 1726882172.91111: sending task result for task 12673a56-9f93-8b02-b216-00000000002b 10896 1726882172.91401: done sending task result for task 12673a56-9f93-8b02-b216-00000000002b 10896 1726882172.91405: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10896 1726882172.91453: no more pending results, returning what we have 10896 1726882172.91455: results queue empty 10896 1726882172.91456: checking for any_errors_fatal 10896 1726882172.91463: done checking for any_errors_fatal 10896 1726882172.91463: checking for max_fail_percentage 10896 1726882172.91465: done checking for max_fail_percentage 10896 1726882172.91466: checking to see if all hosts have failed and the running result is not ok 10896 1726882172.91466: done checking to see if all hosts have failed 10896 1726882172.91467: getting the remaining hosts for this loop 10896 1726882172.91468: done getting the remaining hosts for this loop 10896 1726882172.91472: getting the next task for host managed_node2 10896 1726882172.91478: done getting next task for host managed_node2 10896 1726882172.91481: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 10896 1726882172.91484: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882172.91502: getting variables 10896 1726882172.91503: in VariableManager get_vars() 10896 1726882172.91547: Calling all_inventory to load vars for managed_node2 10896 1726882172.91550: Calling groups_inventory to load vars for managed_node2 10896 1726882172.91553: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882172.91563: Calling all_plugins_play to load vars for managed_node2 10896 1726882172.91566: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882172.91569: Calling groups_plugins_play to load vars for managed_node2 10896 1726882172.92952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882172.94498: done with get_vars() 10896 1726882172.94520: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 10896 1726882172.94598: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:29:32 -0400 (0:00:00.118) 0:00:14.513 ****** 10896 1726882172.94629: entering _queue_task() for managed_node2/yum 10896 1726882172.94631: Creating lock for yum 10896 1726882172.95114: worker is 1 (out of 1 available) 10896 1726882172.95123: exiting _queue_task() for managed_node2/yum 10896 1726882172.95134: done queuing things up, now waiting for results queue to drain 10896 1726882172.95135: waiting for pending results... 
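For context, a minimal reconstruction of the task that was just skipped (roles/network/tasks/main.yml:36). Only the task name, the dnf action plugin, and the when expression reported as false_condition are taken from the log above; the module arguments below are illustrative placeholders, not necessarily the role's actual parameters:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"  # placeholder; real arguments are not visible in this log
    state: latest                   # placeholder
  when: __network_wireless_connections_defined or __network_team_connections_defined

Neither wireless nor team connections are defined for this run, so the expression evaluates to False and the task is skipped, matching the skip result recorded above.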
10896 1726882172.95208: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 10896 1726882172.95329: in run() - task 12673a56-9f93-8b02-b216-00000000002c 10896 1726882172.95347: variable 'ansible_search_path' from source: unknown 10896 1726882172.95358: variable 'ansible_search_path' from source: unknown 10896 1726882172.95399: calling self._execute() 10896 1726882172.95483: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882172.95499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882172.95515: variable 'omit' from source: magic vars 10896 1726882172.95874: variable 'ansible_distribution_major_version' from source: facts 10896 1726882172.95890: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882172.96069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882172.98662: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882172.98740: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882172.98782: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882172.98830: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882172.98861: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882172.98947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882172.98981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882172.99016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882172.99064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882172.99082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882172.99177: variable 'ansible_distribution_major_version' from source: facts 10896 1726882172.99262: Evaluated conditional (ansible_distribution_major_version | int < 8): False 10896 1726882172.99265: when evaluation is False, skipping this task 10896 1726882172.99267: _execute() done 10896 1726882172.99269: dumping result to json 10896 1726882172.99271: done dumping result, returning 10896 1726882172.99274: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-8b02-b216-00000000002c] 10896 
1726882172.99276: sending task result for task 12673a56-9f93-8b02-b216-00000000002c 10896 1726882172.99352: done sending task result for task 12673a56-9f93-8b02-b216-00000000002c 10896 1726882172.99355: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 10896 1726882172.99422: no more pending results, returning what we have 10896 1726882172.99426: results queue empty 10896 1726882172.99427: checking for any_errors_fatal 10896 1726882172.99431: done checking for any_errors_fatal 10896 1726882172.99432: checking for max_fail_percentage 10896 1726882172.99433: done checking for max_fail_percentage 10896 1726882172.99434: checking to see if all hosts have failed and the running result is not ok 10896 1726882172.99435: done checking to see if all hosts have failed 10896 1726882172.99436: getting the remaining hosts for this loop 10896 1726882172.99437: done getting the remaining hosts for this loop 10896 1726882172.99441: getting the next task for host managed_node2 10896 1726882172.99447: done getting next task for host managed_node2 10896 1726882172.99451: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 10896 1726882172.99453: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882172.99467: getting variables 10896 1726882172.99468: in VariableManager get_vars() 10896 1726882172.99517: Calling all_inventory to load vars for managed_node2 10896 1726882172.99520: Calling groups_inventory to load vars for managed_node2 10896 1726882172.99523: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882172.99533: Calling all_plugins_play to load vars for managed_node2 10896 1726882172.99535: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882172.99538: Calling groups_plugins_play to load vars for managed_node2 10896 1726882173.02680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882173.06148: done with get_vars() 10896 1726882173.06183: done getting variables 10896 1726882173.06286: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:29:33 -0400 (0:00:00.116) 0:00:14.630 ****** 10896 1726882173.06323: entering _queue_task() for managed_node2/fail 10896 1726882173.07065: worker is 1 (out of 1 available) 10896 1726882173.07078: exiting _queue_task() for managed_node2/fail 10896 1726882173.07091: done queuing things up, now waiting for results queue to drain 10896 1726882173.07397: waiting for pending results... 
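Similarly, a sketch of the YUM variant just skipped (roles/network/tasks/main.yml:48). The task name, the yum action (redirected to ansible.builtin.dnf on this host, as logged above), and the when expression come from the log; the arguments are placeholders:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:              # resolved to ansible.builtin.dnf by the redirect noted above
    name: "{{ network_packages }}"  # placeholder; real arguments are not shown in this log
    state: latest                   # placeholder
  when: ansible_distribution_major_version | int < 8

The condition evaluated to False here, which implies the managed host reports a distribution major version of 8 or later, so the YUM path is skipped in favor of the DNF check above.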
10896 1726882173.07940: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 10896 1726882173.08056: in run() - task 12673a56-9f93-8b02-b216-00000000002d 10896 1726882173.08116: variable 'ansible_search_path' from source: unknown 10896 1726882173.08125: variable 'ansible_search_path' from source: unknown 10896 1726882173.08259: calling self._execute() 10896 1726882173.08330: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882173.08343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882173.08356: variable 'omit' from source: magic vars 10896 1726882173.09210: variable 'ansible_distribution_major_version' from source: facts 10896 1726882173.09215: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882173.09376: variable '__network_wireless_connections_defined' from source: role '' defaults 10896 1726882173.09757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882173.14407: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882173.14682: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882173.14685: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882173.14688: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882173.14810: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882173.14899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882173.15201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882173.15204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882173.15207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882173.15222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882173.15274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882173.15363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882173.15392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882173.15487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882173.15663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882173.15666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882173.15669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882173.15671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882173.15810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882173.15990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882173.16191: variable 'network_connections' from source: task vars 10896 1726882173.16318: variable 'controller_profile' from source: play vars 10896 1726882173.16399: variable 'controller_profile' from source: play vars 10896 1726882173.16701: variable 'controller_device' from source: play vars 10896 1726882173.16704: variable 'controller_device' from source: play vars 10896 1726882173.16707: variable 'port1_profile' from source: play vars 10896 1726882173.16749: variable 'port1_profile' from source: play vars 10896 1726882173.16762: variable 'dhcp_interface1' from source: play vars 10896 1726882173.16940: variable 'dhcp_interface1' from source: play vars 10896 1726882173.16951: variable 'controller_profile' from source: play vars 10896 1726882173.17014: variable 'controller_profile' from source: play vars 10896 1726882173.17035: variable 'port2_profile' from source: play vars 10896 1726882173.17160: variable 'port2_profile' from source: play vars 10896 1726882173.17254: variable 'dhcp_interface2' from source: play vars 10896 1726882173.17318: variable 'dhcp_interface2' from source: play vars 10896 1726882173.17329: variable 'controller_profile' from source: play vars 10896 1726882173.17412: variable 'controller_profile' from source: play vars 10896 1726882173.17566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882173.17990: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882173.18200: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882173.18203: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882173.18231: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882173.18338: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882173.18558: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 1726882173.18561: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882173.18564: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882173.18687: variable '__network_team_connections_defined' from source: role '' defaults 10896 1726882173.19159: variable 'network_connections' from source: task vars 10896 1726882173.19171: variable 'controller_profile' from source: play vars 10896 1726882173.19343: variable 'controller_profile' from source: play vars 10896 1726882173.19361: variable 'controller_device' from source: play vars 10896 1726882173.19432: variable 'controller_device' from source: play vars 10896 1726882173.19447: variable 'port1_profile' from source: play vars 10896 1726882173.19512: variable 'port1_profile' from source: play vars 10896 1726882173.19525: variable 'dhcp_interface1' from source: play vars 10896 1726882173.19598: variable 'dhcp_interface1' from source: play vars 10896 1726882173.19611: variable 'controller_profile' from source: play vars 10896 1726882173.19675: variable 'controller_profile' from source: play vars 10896 1726882173.19686: variable 'port2_profile' from source: play vars 10896 1726882173.19750: variable 'port2_profile' from source: play vars 10896 1726882173.19765: variable 'dhcp_interface2' from source: play vars 10896 1726882173.19828: variable 'dhcp_interface2' from source: play vars 10896 1726882173.19839: variable 'controller_profile' from source: play vars 10896 1726882173.19908: variable 'controller_profile' from source: play vars 10896 1726882173.19945: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10896 1726882173.19953: when evaluation is False, skipping this task 10896 1726882173.19959: _execute() done 10896 1726882173.19966: dumping result to json 10896 1726882173.19977: done dumping result, returning 10896 1726882173.19989: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-8b02-b216-00000000002d] 10896 1726882173.20004: sending task result for task 12673a56-9f93-8b02-b216-00000000002d skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10896 1726882173.20273: no more pending results, returning what we have 10896 1726882173.20277: results queue empty 10896 1726882173.20278: checking for any_errors_fatal 10896 1726882173.20282: done checking for any_errors_fatal 10896 1726882173.20283: checking for max_fail_percentage 10896 1726882173.20285: done checking for max_fail_percentage 10896 1726882173.20286: checking to see 
if all hosts have failed and the running result is not ok 10896 1726882173.20287: done checking to see if all hosts have failed 10896 1726882173.20287: getting the remaining hosts for this loop 10896 1726882173.20289: done getting the remaining hosts for this loop 10896 1726882173.20296: getting the next task for host managed_node2 10896 1726882173.20304: done getting next task for host managed_node2 10896 1726882173.20308: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 10896 1726882173.20312: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882173.20326: getting variables 10896 1726882173.20328: in VariableManager get_vars() 10896 1726882173.20373: Calling all_inventory to load vars for managed_node2 10896 1726882173.20376: Calling groups_inventory to load vars for managed_node2 10896 1726882173.20378: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882173.20389: Calling all_plugins_play to load vars for managed_node2 10896 1726882173.20392: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882173.20603: Calling groups_plugins_play to load vars for managed_node2 10896 1726882173.21309: done sending task result for task 12673a56-9f93-8b02-b216-00000000002d 10896 1726882173.21313: WORKER PROCESS EXITING 10896 1726882173.22279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882173.24140: done with get_vars() 10896 1726882173.24168: done getting variables 10896 1726882173.24233: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:29:33 -0400 (0:00:00.179) 0:00:14.809 ****** 10896 1726882173.24266: entering _queue_task() for managed_node2/package 10896 1726882173.24579: worker is 1 (out of 1 available) 10896 1726882173.24591: exiting _queue_task() for managed_node2/package 10896 1726882173.24607: done queuing things up, now waiting for results queue to drain 10896 1726882173.24609: waiting for pending results... 
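A sketch of the consent task that was skipped just above (roles/network/tasks/main.yml:60). The fail action plugin and the when expression are taken from the log; the msg text is not captured in this log and appears only as a placeholder:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: "<prompt text not captured in this log>"
  when: __network_wireless_connections_defined or __network_team_connections_defined

As with the earlier wireless/team checks, the condition is False for this run, so no consent prompt is needed and the play proceeds to package installation.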
10896 1726882173.25044: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 10896 1726882173.25411: in run() - task 12673a56-9f93-8b02-b216-00000000002e 10896 1726882173.25431: variable 'ansible_search_path' from source: unknown 10896 1726882173.25440: variable 'ansible_search_path' from source: unknown 10896 1726882173.25481: calling self._execute() 10896 1726882173.25574: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882173.25708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882173.25729: variable 'omit' from source: magic vars 10896 1726882173.26356: variable 'ansible_distribution_major_version' from source: facts 10896 1726882173.26505: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882173.26864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882173.27486: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882173.27536: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882173.27633: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882173.27813: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882173.28078: variable 'network_packages' from source: role '' defaults 10896 1726882173.28347: variable '__network_provider_setup' from source: role '' defaults 10896 1726882173.28350: variable '__network_service_name_default_nm' from source: role '' defaults 10896 1726882173.28519: variable '__network_service_name_default_nm' from source: role '' defaults 10896 1726882173.28573: variable '__network_packages_default_nm' from source: role '' defaults 10896 1726882173.28728: variable '__network_packages_default_nm' from source: role '' defaults 10896 1726882173.29260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882173.33692: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882173.33783: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882173.33868: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882173.33914: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882173.33950: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882173.34049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882173.34083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882173.34120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882173.34172: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882173.34190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882173.34245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882173.34278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882173.34315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882173.34360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882173.34385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882173.34676: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10896 1726882173.34807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882173.34837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882173.34872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882173.35033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882173.35200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882173.35252: variable 'ansible_python' from source: facts 10896 1726882173.35334: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 10896 1726882173.35475: variable '__network_wpa_supplicant_required' from source: role '' defaults 10896 1726882173.35598: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10896 1726882173.35730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882173.35782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 10896 1726882173.35815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882173.35863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882173.35887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882173.35939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882173.35981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882173.36018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882173.36067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882173.36091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882173.36363: variable 'network_connections' from source: task vars 10896 1726882173.36406: variable 'controller_profile' from source: play vars 10896 1726882173.36603: variable 'controller_profile' from source: play vars 10896 1726882173.36819: variable 'controller_device' from source: play vars 10896 1726882173.36906: variable 'controller_device' from source: play vars 10896 1726882173.36931: variable 'port1_profile' from source: play vars 10896 1726882173.37116: variable 'port1_profile' from source: play vars 10896 1726882173.37374: variable 'dhcp_interface1' from source: play vars 10896 1726882173.37439: variable 'dhcp_interface1' from source: play vars 10896 1726882173.37453: variable 'controller_profile' from source: play vars 10896 1726882173.37560: variable 'controller_profile' from source: play vars 10896 1726882173.37574: variable 'port2_profile' from source: play vars 10896 1726882173.37684: variable 'port2_profile' from source: play vars 10896 1726882173.37712: variable 'dhcp_interface2' from source: play vars 10896 1726882173.37823: variable 'dhcp_interface2' from source: play vars 10896 1726882173.37838: variable 'controller_profile' from source: play vars 10896 1726882173.37945: variable 'controller_profile' from source: play vars 10896 1726882173.38037: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882173.38072: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 
1726882173.38110: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882173.38154: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882173.38212: variable '__network_wireless_connections_defined' from source: role '' defaults 10896 1726882173.38515: variable 'network_connections' from source: task vars 10896 1726882173.38525: variable 'controller_profile' from source: play vars 10896 1726882173.38633: variable 'controller_profile' from source: play vars 10896 1726882173.38648: variable 'controller_device' from source: play vars 10896 1726882173.38753: variable 'controller_device' from source: play vars 10896 1726882173.38768: variable 'port1_profile' from source: play vars 10896 1726882173.38873: variable 'port1_profile' from source: play vars 10896 1726882173.38887: variable 'dhcp_interface1' from source: play vars 10896 1726882173.38988: variable 'dhcp_interface1' from source: play vars 10896 1726882173.39014: variable 'controller_profile' from source: play vars 10896 1726882173.39120: variable 'controller_profile' from source: play vars 10896 1726882173.39135: variable 'port2_profile' from source: play vars 10896 1726882173.39243: variable 'port2_profile' from source: play vars 10896 1726882173.39259: variable 'dhcp_interface2' from source: play vars 10896 1726882173.39447: variable 'dhcp_interface2' from source: play vars 10896 1726882173.39451: variable 'controller_profile' from source: play vars 10896 1726882173.39500: variable 'controller_profile' from source: play vars 10896 1726882173.39572: variable '__network_packages_default_wireless' from source: role '' defaults 10896 1726882173.39666: variable '__network_wireless_connections_defined' from source: role '' defaults 10896 1726882173.40007: variable 'network_connections' from source: task vars 10896 1726882173.40018: variable 'controller_profile' from source: play vars 10896 1726882173.40082: variable 'controller_profile' from source: play vars 10896 1726882173.40101: variable 'controller_device' from source: play vars 10896 1726882173.40165: variable 'controller_device' from source: play vars 10896 1726882173.40179: variable 'port1_profile' from source: play vars 10896 1726882173.40252: variable 'port1_profile' from source: play vars 10896 1726882173.40263: variable 'dhcp_interface1' from source: play vars 10896 1726882173.40429: variable 'dhcp_interface1' from source: play vars 10896 1726882173.40432: variable 'controller_profile' from source: play vars 10896 1726882173.40434: variable 'controller_profile' from source: play vars 10896 1726882173.40436: variable 'port2_profile' from source: play vars 10896 1726882173.40485: variable 'port2_profile' from source: play vars 10896 1726882173.40500: variable 'dhcp_interface2' from source: play vars 10896 1726882173.40571: variable 'dhcp_interface2' from source: play vars 10896 1726882173.40582: variable 'controller_profile' from source: play vars 10896 1726882173.40663: variable 'controller_profile' from source: play vars 10896 1726882173.40696: variable '__network_packages_default_team' from source: role '' defaults 10896 1726882173.40812: variable '__network_team_connections_defined' from source: role '' defaults 10896 1726882173.41156: variable 'network_connections' from source: 
task vars 10896 1726882173.41168: variable 'controller_profile' from source: play vars 10896 1726882173.41245: variable 'controller_profile' from source: play vars 10896 1726882173.41257: variable 'controller_device' from source: play vars 10896 1726882173.41328: variable 'controller_device' from source: play vars 10896 1726882173.41341: variable 'port1_profile' from source: play vars 10896 1726882173.41417: variable 'port1_profile' from source: play vars 10896 1726882173.41428: variable 'dhcp_interface1' from source: play vars 10896 1726882173.41520: variable 'dhcp_interface1' from source: play vars 10896 1726882173.41523: variable 'controller_profile' from source: play vars 10896 1726882173.41571: variable 'controller_profile' from source: play vars 10896 1726882173.41583: variable 'port2_profile' from source: play vars 10896 1726882173.41680: variable 'port2_profile' from source: play vars 10896 1726882173.41714: variable 'dhcp_interface2' from source: play vars 10896 1726882173.41781: variable 'dhcp_interface2' from source: play vars 10896 1726882173.41827: variable 'controller_profile' from source: play vars 10896 1726882173.41897: variable 'controller_profile' from source: play vars 10896 1726882173.41963: variable '__network_service_name_default_initscripts' from source: role '' defaults 10896 1726882173.42010: variable '__network_service_name_default_initscripts' from source: role '' defaults 10896 1726882173.42015: variable '__network_packages_default_initscripts' from source: role '' defaults 10896 1726882173.42064: variable '__network_packages_default_initscripts' from source: role '' defaults 10896 1726882173.42217: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 10896 1726882173.42526: variable 'network_connections' from source: task vars 10896 1726882173.42530: variable 'controller_profile' from source: play vars 10896 1726882173.42571: variable 'controller_profile' from source: play vars 10896 1726882173.42579: variable 'controller_device' from source: play vars 10896 1726882173.42623: variable 'controller_device' from source: play vars 10896 1726882173.42631: variable 'port1_profile' from source: play vars 10896 1726882173.42672: variable 'port1_profile' from source: play vars 10896 1726882173.42678: variable 'dhcp_interface1' from source: play vars 10896 1726882173.42723: variable 'dhcp_interface1' from source: play vars 10896 1726882173.42728: variable 'controller_profile' from source: play vars 10896 1726882173.42767: variable 'controller_profile' from source: play vars 10896 1726882173.42774: variable 'port2_profile' from source: play vars 10896 1726882173.42817: variable 'port2_profile' from source: play vars 10896 1726882173.42825: variable 'dhcp_interface2' from source: play vars 10896 1726882173.42867: variable 'dhcp_interface2' from source: play vars 10896 1726882173.42872: variable 'controller_profile' from source: play vars 10896 1726882173.42916: variable 'controller_profile' from source: play vars 10896 1726882173.42923: variable 'ansible_distribution' from source: facts 10896 1726882173.42925: variable '__network_rh_distros' from source: role '' defaults 10896 1726882173.42931: variable 'ansible_distribution_major_version' from source: facts 10896 1726882173.42953: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 10896 1726882173.43058: variable 'ansible_distribution' from source: facts 10896 1726882173.43062: variable '__network_rh_distros' from source: role '' defaults 
10896 1726882173.43067: variable 'ansible_distribution_major_version' from source: facts 10896 1726882173.43078: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 10896 1726882173.43185: variable 'ansible_distribution' from source: facts 10896 1726882173.43188: variable '__network_rh_distros' from source: role '' defaults 10896 1726882173.43194: variable 'ansible_distribution_major_version' from source: facts 10896 1726882173.43223: variable 'network_provider' from source: set_fact 10896 1726882173.43236: variable 'ansible_facts' from source: unknown 10896 1726882173.43603: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 10896 1726882173.43606: when evaluation is False, skipping this task 10896 1726882173.43609: _execute() done 10896 1726882173.43611: dumping result to json 10896 1726882173.43613: done dumping result, returning 10896 1726882173.43622: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-8b02-b216-00000000002e] 10896 1726882173.43627: sending task result for task 12673a56-9f93-8b02-b216-00000000002e 10896 1726882173.43956: done sending task result for task 12673a56-9f93-8b02-b216-00000000002e 10896 1726882173.43958: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 10896 1726882173.44005: no more pending results, returning what we have 10896 1726882173.44008: results queue empty 10896 1726882173.44009: checking for any_errors_fatal 10896 1726882173.44013: done checking for any_errors_fatal 10896 1726882173.44014: checking for max_fail_percentage 10896 1726882173.44015: done checking for max_fail_percentage 10896 1726882173.44016: checking to see if all hosts have failed and the running result is not ok 10896 1726882173.44017: done checking to see if all hosts have failed 10896 1726882173.44017: getting the remaining hosts for this loop 10896 1726882173.44019: done getting the remaining hosts for this loop 10896 1726882173.44023: getting the next task for host managed_node2 10896 1726882173.44028: done getting next task for host managed_node2 10896 1726882173.44032: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 10896 1726882173.44035: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882173.44048: getting variables 10896 1726882173.44049: in VariableManager get_vars() 10896 1726882173.44095: Calling all_inventory to load vars for managed_node2 10896 1726882173.44121: Calling groups_inventory to load vars for managed_node2 10896 1726882173.44124: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882173.44132: Calling all_plugins_play to load vars for managed_node2 10896 1726882173.44135: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882173.44137: Calling groups_plugins_play to load vars for managed_node2 10896 1726882173.46271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882173.47352: done with get_vars() 10896 1726882173.47371: done getting variables 10896 1726882173.47420: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:29:33 -0400 (0:00:00.231) 0:00:15.041 ****** 10896 1726882173.47446: entering _queue_task() for managed_node2/package 10896 1726882173.47704: worker is 1 (out of 1 available) 10896 1726882173.47717: exiting _queue_task() for managed_node2/package 10896 1726882173.47730: done queuing things up, now waiting for results queue to drain 10896 1726882173.47731: waiting for pending results... 
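The "Install packages" skip above happens because its guard, not network_packages is subset(ansible_facts.packages.keys()), evaluates to False, meaning every required package is already installed on the node. A standalone sketch of the same guard, assuming a placeholder package list and a prior package_facts run (not the role's actual task file):

    - name: Gather installed packages into ansible_facts.packages
      ansible.builtin.package_facts:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      vars:
        network_packages: [NetworkManager]   # placeholder; the real list is computed by the role
      when: not network_packages is subset(ansible_facts.packages.keys())
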
10896 1726882173.47904: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 10896 1726882173.47985: in run() - task 12673a56-9f93-8b02-b216-00000000002f 10896 1726882173.48001: variable 'ansible_search_path' from source: unknown 10896 1726882173.48009: variable 'ansible_search_path' from source: unknown 10896 1726882173.48039: calling self._execute() 10896 1726882173.48113: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882173.48117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882173.48126: variable 'omit' from source: magic vars 10896 1726882173.48399: variable 'ansible_distribution_major_version' from source: facts 10896 1726882173.48404: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882173.48483: variable 'network_state' from source: role '' defaults 10896 1726882173.48490: Evaluated conditional (network_state != {}): False 10896 1726882173.48498: when evaluation is False, skipping this task 10896 1726882173.48502: _execute() done 10896 1726882173.48505: dumping result to json 10896 1726882173.48507: done dumping result, returning 10896 1726882173.48510: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-8b02-b216-00000000002f] 10896 1726882173.48522: sending task result for task 12673a56-9f93-8b02-b216-00000000002f 10896 1726882173.48604: done sending task result for task 12673a56-9f93-8b02-b216-00000000002f 10896 1726882173.48607: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10896 1726882173.48665: no more pending results, returning what we have 10896 1726882173.48668: results queue empty 10896 1726882173.48669: checking for any_errors_fatal 10896 1726882173.48673: done checking for any_errors_fatal 10896 1726882173.48674: checking for max_fail_percentage 10896 1726882173.48675: done checking for max_fail_percentage 10896 1726882173.48676: checking to see if all hosts have failed and the running result is not ok 10896 1726882173.48677: done checking to see if all hosts have failed 10896 1726882173.48678: getting the remaining hosts for this loop 10896 1726882173.48679: done getting the remaining hosts for this loop 10896 1726882173.48682: getting the next task for host managed_node2 10896 1726882173.48688: done getting next task for host managed_node2 10896 1726882173.48692: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 10896 1726882173.48698: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882173.48713: getting variables 10896 1726882173.48714: in VariableManager get_vars() 10896 1726882173.48752: Calling all_inventory to load vars for managed_node2 10896 1726882173.48754: Calling groups_inventory to load vars for managed_node2 10896 1726882173.48756: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882173.48764: Calling all_plugins_play to load vars for managed_node2 10896 1726882173.48767: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882173.48769: Calling groups_plugins_play to load vars for managed_node2 10896 1726882173.49806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882173.53756: done with get_vars() 10896 1726882173.53777: done getting variables 10896 1726882173.53846: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:29:33 -0400 (0:00:00.064) 0:00:15.105 ****** 10896 1726882173.53882: entering _queue_task() for managed_node2/package 10896 1726882173.54227: worker is 1 (out of 1 available) 10896 1726882173.54244: exiting _queue_task() for managed_node2/package 10896 1726882173.54257: done queuing things up, now waiting for results queue to drain 10896 1726882173.54259: waiting for pending results... 
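The skip just above follows the pattern used throughout this block: the distro guard (ansible_distribution_major_version != '6') passes, but network_state is still the empty role default, so network_state != {} is False and the package install never runs. A simplified sketch of that task shape, with placeholder package names (the real task lives at roles/network/tasks/main.yml:85):

    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name: [NetworkManager, nmstate]      # placeholder package list
        state: present
      when:
        - ansible_distribution_major_version != '6'
        - network_state != {}
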
10896 1726882173.54489: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 10896 1726882173.54657: in run() - task 12673a56-9f93-8b02-b216-000000000030 10896 1726882173.54660: variable 'ansible_search_path' from source: unknown 10896 1726882173.54663: variable 'ansible_search_path' from source: unknown 10896 1726882173.54671: calling self._execute() 10896 1726882173.54752: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882173.54765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882173.54772: variable 'omit' from source: magic vars 10896 1726882173.55080: variable 'ansible_distribution_major_version' from source: facts 10896 1726882173.55083: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882173.55236: variable 'network_state' from source: role '' defaults 10896 1726882173.55240: Evaluated conditional (network_state != {}): False 10896 1726882173.55243: when evaluation is False, skipping this task 10896 1726882173.55245: _execute() done 10896 1726882173.55248: dumping result to json 10896 1726882173.55250: done dumping result, returning 10896 1726882173.55253: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-8b02-b216-000000000030] 10896 1726882173.55289: sending task result for task 12673a56-9f93-8b02-b216-000000000030 10896 1726882173.55412: done sending task result for task 12673a56-9f93-8b02-b216-000000000030 10896 1726882173.55416: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10896 1726882173.55479: no more pending results, returning what we have 10896 1726882173.55482: results queue empty 10896 1726882173.55483: checking for any_errors_fatal 10896 1726882173.55496: done checking for any_errors_fatal 10896 1726882173.55497: checking for max_fail_percentage 10896 1726882173.55500: done checking for max_fail_percentage 10896 1726882173.55501: checking to see if all hosts have failed and the running result is not ok 10896 1726882173.55502: done checking to see if all hosts have failed 10896 1726882173.55502: getting the remaining hosts for this loop 10896 1726882173.55504: done getting the remaining hosts for this loop 10896 1726882173.55510: getting the next task for host managed_node2 10896 1726882173.55516: done getting next task for host managed_node2 10896 1726882173.55521: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 10896 1726882173.55523: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882173.55538: getting variables 10896 1726882173.55539: in VariableManager get_vars() 10896 1726882173.55580: Calling all_inventory to load vars for managed_node2 10896 1726882173.55583: Calling groups_inventory to load vars for managed_node2 10896 1726882173.55585: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882173.55622: Calling all_plugins_play to load vars for managed_node2 10896 1726882173.55626: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882173.55630: Calling groups_plugins_play to load vars for managed_node2 10896 1726882173.56834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882173.58135: done with get_vars() 10896 1726882173.58154: done getting variables 10896 1726882173.58241: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:29:33 -0400 (0:00:00.043) 0:00:15.149 ****** 10896 1726882173.58266: entering _queue_task() for managed_node2/service 10896 1726882173.58267: Creating lock for service 10896 1726882173.58584: worker is 1 (out of 1 available) 10896 1726882173.58599: exiting _queue_task() for managed_node2/service 10896 1726882173.58613: done queuing things up, now waiting for results queue to drain 10896 1726882173.58614: waiting for pending results... 
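Both nmstate-related install tasks are skipped here only because network_state defaults to an empty dict. A hypothetical invocation that would make network_state != {} true and exercise those tasks (the state content below is purely illustrative, not from this run):

    - hosts: managed_node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_state:             # any non-empty dict flips the guard
              interfaces:
                - name: eth1           # placeholder interface
                  type: ethernet
                  state: up
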
10896 1726882173.58823: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 10896 1726882173.58937: in run() - task 12673a56-9f93-8b02-b216-000000000031 10896 1726882173.58949: variable 'ansible_search_path' from source: unknown 10896 1726882173.58953: variable 'ansible_search_path' from source: unknown 10896 1726882173.59008: calling self._execute() 10896 1726882173.59080: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882173.59084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882173.59094: variable 'omit' from source: magic vars 10896 1726882173.59444: variable 'ansible_distribution_major_version' from source: facts 10896 1726882173.59453: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882173.59578: variable '__network_wireless_connections_defined' from source: role '' defaults 10896 1726882173.59742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882173.61910: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882173.61914: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882173.61917: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882173.61952: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882173.61978: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882173.62059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882173.62088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882173.62116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882173.62156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882173.62170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882173.62221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882173.62244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882173.62269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 10896 1726882173.62312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882173.62325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882173.62364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882173.62385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882173.62411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882173.62452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882173.62462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882173.62655: variable 'network_connections' from source: task vars 10896 1726882173.62667: variable 'controller_profile' from source: play vars 10896 1726882173.62741: variable 'controller_profile' from source: play vars 10896 1726882173.62751: variable 'controller_device' from source: play vars 10896 1726882173.63106: variable 'controller_device' from source: play vars 10896 1726882173.63110: variable 'port1_profile' from source: play vars 10896 1726882173.63113: variable 'port1_profile' from source: play vars 10896 1726882173.63115: variable 'dhcp_interface1' from source: play vars 10896 1726882173.63117: variable 'dhcp_interface1' from source: play vars 10896 1726882173.63119: variable 'controller_profile' from source: play vars 10896 1726882173.63121: variable 'controller_profile' from source: play vars 10896 1726882173.63123: variable 'port2_profile' from source: play vars 10896 1726882173.63125: variable 'port2_profile' from source: play vars 10896 1726882173.63127: variable 'dhcp_interface2' from source: play vars 10896 1726882173.63129: variable 'dhcp_interface2' from source: play vars 10896 1726882173.63131: variable 'controller_profile' from source: play vars 10896 1726882173.63192: variable 'controller_profile' from source: play vars 10896 1726882173.63268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882173.63646: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882173.63674: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882173.63701: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882173.63723: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882173.63754: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882173.63772: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 1726882173.63791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882173.63813: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882173.63860: variable '__network_team_connections_defined' from source: role '' defaults 10896 1726882173.64116: variable 'network_connections' from source: task vars 10896 1726882173.64120: variable 'controller_profile' from source: play vars 10896 1726882173.64164: variable 'controller_profile' from source: play vars 10896 1726882173.64398: variable 'controller_device' from source: play vars 10896 1726882173.64402: variable 'controller_device' from source: play vars 10896 1726882173.64404: variable 'port1_profile' from source: play vars 10896 1726882173.64407: variable 'port1_profile' from source: play vars 10896 1726882173.64409: variable 'dhcp_interface1' from source: play vars 10896 1726882173.64411: variable 'dhcp_interface1' from source: play vars 10896 1726882173.64413: variable 'controller_profile' from source: play vars 10896 1726882173.64453: variable 'controller_profile' from source: play vars 10896 1726882173.64465: variable 'port2_profile' from source: play vars 10896 1726882173.64528: variable 'port2_profile' from source: play vars 10896 1726882173.64536: variable 'dhcp_interface2' from source: play vars 10896 1726882173.64595: variable 'dhcp_interface2' from source: play vars 10896 1726882173.64619: variable 'controller_profile' from source: play vars 10896 1726882173.64699: variable 'controller_profile' from source: play vars 10896 1726882173.64763: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10896 1726882173.64770: when evaluation is False, skipping this task 10896 1726882173.64773: _execute() done 10896 1726882173.64799: dumping result to json 10896 1726882173.64804: done dumping result, returning 10896 1726882173.64843: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-8b02-b216-000000000031] 10896 1726882173.64846: sending task result for task 12673a56-9f93-8b02-b216-000000000031 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10896 1726882173.64978: no more pending results, returning what we have 10896 1726882173.64981: results queue empty 10896 1726882173.64982: checking for any_errors_fatal 10896 1726882173.64995: done checking for any_errors_fatal 10896 1726882173.64996: checking for max_fail_percentage 10896 1726882173.64998: done checking for max_fail_percentage 10896 1726882173.64999: checking to see if all hosts have failed and the running result is not ok 10896 1726882173.65000: done checking to see if all hosts have 
failed 10896 1726882173.65000: getting the remaining hosts for this loop 10896 1726882173.65002: done getting the remaining hosts for this loop 10896 1726882173.65005: getting the next task for host managed_node2 10896 1726882173.65012: done getting next task for host managed_node2 10896 1726882173.65015: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10896 1726882173.65018: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882173.65030: getting variables 10896 1726882173.65032: in VariableManager get_vars() 10896 1726882173.65071: Calling all_inventory to load vars for managed_node2 10896 1726882173.65077: Calling groups_inventory to load vars for managed_node2 10896 1726882173.65080: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882173.65090: Calling all_plugins_play to load vars for managed_node2 10896 1726882173.65117: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882173.65125: Calling groups_plugins_play to load vars for managed_node2 10896 1726882173.65728: done sending task result for task 12673a56-9f93-8b02-b216-000000000031 10896 1726882173.65732: WORKER PROCESS EXITING 10896 1726882173.66614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882173.67903: done with get_vars() 10896 1726882173.67918: done getting variables 10896 1726882173.67959: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:29:33 -0400 (0:00:00.097) 0:00:15.246 ****** 10896 1726882173.67981: entering _queue_task() for managed_node2/service 10896 1726882173.68208: worker is 1 (out of 1 available) 10896 1726882173.68220: exiting _queue_task() for managed_node2/service 10896 1726882173.68233: done queuing things up, now waiting for results queue to drain 10896 1726882173.68234: waiting for pending results... 
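The restart task above is skipped because neither wireless nor team connections are defined (__network_wireless_connections_defined or __network_team_connections_defined is False), and the run moves on to enabling and starting the service manager. The task being queued at main.yml:122 corresponds to a plain service enable/start, roughly as sketched below; "NetworkManager" stands in for the role's network_service_name variable, which is resolved later in the log:

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: NetworkManager           # stand-in for the role's network_service_name
        state: started
        enabled: true
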
10896 1726882173.68396: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10896 1726882173.68484: in run() - task 12673a56-9f93-8b02-b216-000000000032 10896 1726882173.68500: variable 'ansible_search_path' from source: unknown 10896 1726882173.68503: variable 'ansible_search_path' from source: unknown 10896 1726882173.68531: calling self._execute() 10896 1726882173.68601: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882173.68606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882173.68614: variable 'omit' from source: magic vars 10896 1726882173.68920: variable 'ansible_distribution_major_version' from source: facts 10896 1726882173.68927: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882173.69082: variable 'network_provider' from source: set_fact 10896 1726882173.69085: variable 'network_state' from source: role '' defaults 10896 1726882173.69091: Evaluated conditional (network_provider == "nm" or network_state != {}): True 10896 1726882173.69100: variable 'omit' from source: magic vars 10896 1726882173.69136: variable 'omit' from source: magic vars 10896 1726882173.69157: variable 'network_service_name' from source: role '' defaults 10896 1726882173.69269: variable 'network_service_name' from source: role '' defaults 10896 1726882173.69417: variable '__network_provider_setup' from source: role '' defaults 10896 1726882173.69420: variable '__network_service_name_default_nm' from source: role '' defaults 10896 1726882173.69474: variable '__network_service_name_default_nm' from source: role '' defaults 10896 1726882173.69481: variable '__network_packages_default_nm' from source: role '' defaults 10896 1726882173.69534: variable '__network_packages_default_nm' from source: role '' defaults 10896 1726882173.69682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882173.71764: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882173.71767: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882173.71946: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882173.71949: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882173.71952: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882173.71955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882173.71969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882173.71986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882173.72044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 10896 1726882173.72052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882173.72114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882173.72129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882173.72157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882173.72399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882173.72402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882173.72480: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10896 1726882173.72605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882173.72636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882173.72664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882173.72715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882173.72739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882173.72843: variable 'ansible_python' from source: facts 10896 1726882173.72868: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 10896 1726882173.72952: variable '__network_wpa_supplicant_required' from source: role '' defaults 10896 1726882173.73048: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10896 1726882173.73174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882173.73210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882173.73250: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882173.73300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882173.73331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882173.73398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882173.73437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882173.73484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882173.73530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882173.73545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882173.73683: variable 'network_connections' from source: task vars 10896 1726882173.73704: variable 'controller_profile' from source: play vars 10896 1726882173.73777: variable 'controller_profile' from source: play vars 10896 1726882173.73798: variable 'controller_device' from source: play vars 10896 1726882173.74002: variable 'controller_device' from source: play vars 10896 1726882173.74005: variable 'port1_profile' from source: play vars 10896 1726882173.74007: variable 'port1_profile' from source: play vars 10896 1726882173.74009: variable 'dhcp_interface1' from source: play vars 10896 1726882173.74052: variable 'dhcp_interface1' from source: play vars 10896 1726882173.74068: variable 'controller_profile' from source: play vars 10896 1726882173.74148: variable 'controller_profile' from source: play vars 10896 1726882173.74162: variable 'port2_profile' from source: play vars 10896 1726882173.74243: variable 'port2_profile' from source: play vars 10896 1726882173.74268: variable 'dhcp_interface2' from source: play vars 10896 1726882173.74363: variable 'dhcp_interface2' from source: play vars 10896 1726882173.74386: variable 'controller_profile' from source: play vars 10896 1726882173.74475: variable 'controller_profile' from source: play vars 10896 1726882173.74576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882173.74853: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882173.74936: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882173.75018: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 
1726882173.75062: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882173.75133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882173.75181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 1726882173.75242: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882173.75283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882173.75360: variable '__network_wireless_connections_defined' from source: role '' defaults 10896 1726882173.75677: variable 'network_connections' from source: task vars 10896 1726882173.75687: variable 'controller_profile' from source: play vars 10896 1726882173.75770: variable 'controller_profile' from source: play vars 10896 1726882173.75784: variable 'controller_device' from source: play vars 10896 1726882173.75997: variable 'controller_device' from source: play vars 10896 1726882173.76002: variable 'port1_profile' from source: play vars 10896 1726882173.76005: variable 'port1_profile' from source: play vars 10896 1726882173.76007: variable 'dhcp_interface1' from source: play vars 10896 1726882173.76038: variable 'dhcp_interface1' from source: play vars 10896 1726882173.76051: variable 'controller_profile' from source: play vars 10896 1726882173.76129: variable 'controller_profile' from source: play vars 10896 1726882173.76143: variable 'port2_profile' from source: play vars 10896 1726882173.76212: variable 'port2_profile' from source: play vars 10896 1726882173.76230: variable 'dhcp_interface2' from source: play vars 10896 1726882173.76299: variable 'dhcp_interface2' from source: play vars 10896 1726882173.76317: variable 'controller_profile' from source: play vars 10896 1726882173.76392: variable 'controller_profile' from source: play vars 10896 1726882173.76458: variable '__network_packages_default_wireless' from source: role '' defaults 10896 1726882173.76541: variable '__network_wireless_connections_defined' from source: role '' defaults 10896 1726882173.76848: variable 'network_connections' from source: task vars 10896 1726882173.76880: variable 'controller_profile' from source: play vars 10896 1726882173.76937: variable 'controller_profile' from source: play vars 10896 1726882173.76949: variable 'controller_device' from source: play vars 10896 1726882173.77025: variable 'controller_device' from source: play vars 10896 1726882173.77101: variable 'port1_profile' from source: play vars 10896 1726882173.77108: variable 'port1_profile' from source: play vars 10896 1726882173.77120: variable 'dhcp_interface1' from source: play vars 10896 1726882173.77377: variable 'dhcp_interface1' from source: play vars 10896 1726882173.77473: variable 'controller_profile' from source: play vars 10896 1726882173.77476: variable 'controller_profile' from source: play vars 10896 1726882173.77478: variable 'port2_profile' from source: play vars 10896 1726882173.77540: variable 'port2_profile' from source: play vars 10896 
1726882173.77715: variable 'dhcp_interface2' from source: play vars 10896 1726882173.77782: variable 'dhcp_interface2' from source: play vars 10896 1726882173.77798: variable 'controller_profile' from source: play vars 10896 1726882173.77870: variable 'controller_profile' from source: play vars 10896 1726882173.77965: variable '__network_packages_default_team' from source: role '' defaults 10896 1726882173.78164: variable '__network_team_connections_defined' from source: role '' defaults 10896 1726882173.78766: variable 'network_connections' from source: task vars 10896 1726882173.78780: variable 'controller_profile' from source: play vars 10896 1726882173.78859: variable 'controller_profile' from source: play vars 10896 1726882173.78871: variable 'controller_device' from source: play vars 10896 1726882173.78946: variable 'controller_device' from source: play vars 10896 1726882173.78960: variable 'port1_profile' from source: play vars 10896 1726882173.79039: variable 'port1_profile' from source: play vars 10896 1726882173.79052: variable 'dhcp_interface1' from source: play vars 10896 1726882173.79124: variable 'dhcp_interface1' from source: play vars 10896 1726882173.79141: variable 'controller_profile' from source: play vars 10896 1726882173.79213: variable 'controller_profile' from source: play vars 10896 1726882173.79226: variable 'port2_profile' from source: play vars 10896 1726882173.79303: variable 'port2_profile' from source: play vars 10896 1726882173.79316: variable 'dhcp_interface2' from source: play vars 10896 1726882173.79388: variable 'dhcp_interface2' from source: play vars 10896 1726882173.79405: variable 'controller_profile' from source: play vars 10896 1726882173.79478: variable 'controller_profile' from source: play vars 10896 1726882173.79552: variable '__network_service_name_default_initscripts' from source: role '' defaults 10896 1726882173.79627: variable '__network_service_name_default_initscripts' from source: role '' defaults 10896 1726882173.79639: variable '__network_packages_default_initscripts' from source: role '' defaults 10896 1726882173.79707: variable '__network_packages_default_initscripts' from source: role '' defaults 10896 1726882173.79932: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 10896 1726882173.80547: variable 'network_connections' from source: task vars 10896 1726882173.80553: variable 'controller_profile' from source: play vars 10896 1726882173.80772: variable 'controller_profile' from source: play vars 10896 1726882173.80776: variable 'controller_device' from source: play vars 10896 1726882173.80811: variable 'controller_device' from source: play vars 10896 1726882173.80827: variable 'port1_profile' from source: play vars 10896 1726882173.80942: variable 'port1_profile' from source: play vars 10896 1726882173.81005: variable 'dhcp_interface1' from source: play vars 10896 1726882173.81299: variable 'dhcp_interface1' from source: play vars 10896 1726882173.81302: variable 'controller_profile' from source: play vars 10896 1726882173.81400: variable 'controller_profile' from source: play vars 10896 1726882173.81403: variable 'port2_profile' from source: play vars 10896 1726882173.81422: variable 'port2_profile' from source: play vars 10896 1726882173.81435: variable 'dhcp_interface2' from source: play vars 10896 1726882173.81498: variable 'dhcp_interface2' from source: play vars 10896 1726882173.81643: variable 'controller_profile' from source: play vars 10896 1726882173.81706: variable 
'controller_profile' from source: play vars 10896 1726882173.81720: variable 'ansible_distribution' from source: facts 10896 1726882173.81729: variable '__network_rh_distros' from source: role '' defaults 10896 1726882173.81749: variable 'ansible_distribution_major_version' from source: facts 10896 1726882173.81800: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 10896 1726882173.81989: variable 'ansible_distribution' from source: facts 10896 1726882173.82003: variable '__network_rh_distros' from source: role '' defaults 10896 1726882173.82013: variable 'ansible_distribution_major_version' from source: facts 10896 1726882173.82032: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 10896 1726882173.82220: variable 'ansible_distribution' from source: facts 10896 1726882173.82230: variable '__network_rh_distros' from source: role '' defaults 10896 1726882173.82239: variable 'ansible_distribution_major_version' from source: facts 10896 1726882173.82275: variable 'network_provider' from source: set_fact 10896 1726882173.82310: variable 'omit' from source: magic vars 10896 1726882173.82341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882173.82372: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882173.82402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882173.82424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882173.82437: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882173.82466: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882173.82472: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882173.82478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882173.82697: Set connection var ansible_connection to ssh 10896 1726882173.82710: Set connection var ansible_timeout to 10 10896 1726882173.82935: Set connection var ansible_shell_type to sh 10896 1726882173.82938: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882173.82940: Set connection var ansible_shell_executable to /bin/sh 10896 1726882173.82942: Set connection var ansible_pipelining to False 10896 1726882173.82944: variable 'ansible_shell_executable' from source: unknown 10896 1726882173.82945: variable 'ansible_connection' from source: unknown 10896 1726882173.82947: variable 'ansible_module_compression' from source: unknown 10896 1726882173.82949: variable 'ansible_shell_type' from source: unknown 10896 1726882173.82951: variable 'ansible_shell_executable' from source: unknown 10896 1726882173.82952: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882173.82954: variable 'ansible_pipelining' from source: unknown 10896 1726882173.82955: variable 'ansible_timeout' from source: unknown 10896 1726882173.82957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882173.83082: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882173.83261: variable 'omit' from source: magic vars 10896 1726882173.83264: starting attempt loop 10896 1726882173.83266: running the handler 10896 1726882173.83268: variable 'ansible_facts' from source: unknown 10896 1726882173.84975: _low_level_execute_command(): starting 10896 1726882173.85200: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882173.86380: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882173.86578: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882173.86654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882173.86675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882173.86783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882173.88509: stdout chunk (state=3): >>>/root <<< 10896 1726882173.88847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882173.88857: stdout chunk (state=3): >>><<< 10896 1726882173.88868: stderr chunk (state=3): >>><<< 10896 1726882173.88890: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882173.89092: _low_level_execute_command(): starting 10896 
1726882173.89101: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882173.890029-11707-224596579543507 `" && echo ansible-tmp-1726882173.890029-11707-224596579543507="` echo /root/.ansible/tmp/ansible-tmp-1726882173.890029-11707-224596579543507 `" ) && sleep 0' 10896 1726882173.90266: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882173.90613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882173.90632: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882173.90826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882173.92703: stdout chunk (state=3): >>>ansible-tmp-1726882173.890029-11707-224596579543507=/root/.ansible/tmp/ansible-tmp-1726882173.890029-11707-224596579543507 <<< 10896 1726882173.92853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882173.92863: stdout chunk (state=3): >>><<< 10896 1726882173.92880: stderr chunk (state=3): >>><<< 10896 1726882173.92912: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882173.890029-11707-224596579543507=/root/.ansible/tmp/ansible-tmp-1726882173.890029-11707-224596579543507 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882173.93184: variable 'ansible_module_compression' from source: unknown 
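The transport used here is all defaults: multiplexed SSH reusing the control master at /root/.ansible/cp/6f323b04b0, /bin/sh as the remote shell, no pipelining, so each module invocation first probes the remote home directory and creates a per-task temp directory under ~/.ansible/tmp before the AnsiballZ payload is uploaded. The same knobs can be pinned explicitly per host; a sketch mirroring the values reported in the "Set connection var" entries above (the host_vars path is hypothetical):

    # host_vars/managed_node2.yml -- hypothetical file; this run simply used the defaults
    ansible_connection: ssh
    ansible_timeout: 10
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_pipelining: false
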
10896 1726882173.93352: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 10896 1726882173.93433: ANSIBALLZ: Acquiring lock 10896 1726882173.93452: ANSIBALLZ: Lock acquired: 139646160836496 10896 1726882173.93475: ANSIBALLZ: Creating module 10896 1726882174.73105: ANSIBALLZ: Writing module into payload 10896 1726882174.73110: ANSIBALLZ: Writing module 10896 1726882174.73323: ANSIBALLZ: Renaming module 10896 1726882174.73329: ANSIBALLZ: Done creating module 10896 1726882174.73351: variable 'ansible_facts' from source: unknown 10896 1726882174.73949: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882173.890029-11707-224596579543507/AnsiballZ_systemd.py 10896 1726882174.74303: Sending initial data 10896 1726882174.74307: Sent initial data (155 bytes) 10896 1726882174.75609: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882174.75808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882174.77440: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 10896 1726882174.77444: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882174.77508: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10896 1726882174.77575: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpjznlb8vu /root/.ansible/tmp/ansible-tmp-1726882173.890029-11707-224596579543507/AnsiballZ_systemd.py <<< 10896 1726882174.77581: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882173.890029-11707-224596579543507/AnsiballZ_systemd.py" <<< 10896 1726882174.77643: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpjznlb8vu" to remote "/root/.ansible/tmp/ansible-tmp-1726882173.890029-11707-224596579543507/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882173.890029-11707-224596579543507/AnsiballZ_systemd.py" <<< 10896 1726882174.80470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882174.80712: stderr chunk (state=3): >>><<< 10896 1726882174.80716: stdout chunk (state=3): >>><<< 10896 1726882174.80733: done transferring module to remote 10896 1726882174.80744: _low_level_execute_command(): starting 10896 1726882174.80749: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882173.890029-11707-224596579543507/ /root/.ansible/tmp/ansible-tmp-1726882173.890029-11707-224596579543507/AnsiballZ_systemd.py && sleep 0' 10896 1726882174.81770: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882174.81778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882174.82410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882174.82421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882174.82585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882174.84558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882174.84562: stdout chunk (state=3): >>><<< 10896 1726882174.84568: stderr chunk (state=3): >>><<< 10896 1726882174.84584: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882174.84587: _low_level_execute_command(): starting 10896 1726882174.84594: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882173.890029-11707-224596579543507/AnsiballZ_systemd.py && sleep 0' 10896 1726882174.85877: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882174.85943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882174.86016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882174.86042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882174.86204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882175.14783: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": 
"260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "3887104", "MemoryPeak": "4411392", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313082368", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "360428000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredump<<< 10896 1726882175.14929: stdout chunk (state=3): >>>Receive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": 
"8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target 
multi-user.target", "After": "basic.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 10896 1726882175.16529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882175.16544: stderr chunk (state=3): >>>Shared connection to 10.31.14.69 closed. 
<<< 10896 1726882175.16685: stderr chunk (state=3): >>><<< 10896 1726882175.16688: stdout chunk (state=3): >>><<< 10896 1726882175.16710: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "3887104", "MemoryPeak": "4411392", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313082368", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "360428000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target multi-user.target", "After": "basic.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 10896 1726882175.17213: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882173.890029-11707-224596579543507/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882175.17218: _low_level_execute_command(): starting 10896 1726882175.17221: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882173.890029-11707-224596579543507/ > /dev/null 2>&1 && sleep 0' 10896 1726882175.18424: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882175.18435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882175.18446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882175.18595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882175.18753: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882175.19024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882175.20731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882175.20735: stdout chunk (state=3): >>><<< 10896 1726882175.20812: stderr chunk (state=3): >>><<< 10896 1726882175.20816: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882175.20818: handler run complete 10896 1726882175.21027: attempt loop complete, returning result 10896 1726882175.21030: _execute() done 10896 1726882175.21033: dumping result to json 10896 1726882175.21053: done dumping result, returning 10896 1726882175.21063: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-8b02-b216-000000000032] 10896 1726882175.21067: sending task result for task 12673a56-9f93-8b02-b216-000000000032 10896 1726882175.22628: done sending task result for task 12673a56-9f93-8b02-b216-000000000032 10896 1726882175.22631: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10896 1726882175.22677: no more pending results, returning what we have 10896 1726882175.22680: results queue empty 10896 1726882175.22680: checking for any_errors_fatal 10896 1726882175.22685: done checking for any_errors_fatal 10896 1726882175.22685: checking for max_fail_percentage 10896 1726882175.22687: done checking for max_fail_percentage 10896 1726882175.22687: checking to see if all hosts have failed and the running result is not ok 10896 1726882175.22688: done checking to see if all hosts have failed 10896 1726882175.22689: getting the remaining hosts for this loop 10896 1726882175.22690: done getting the remaining hosts for this loop 10896 1726882175.22695: getting the next task for host managed_node2 10896 1726882175.22700: done getting next task for host managed_node2 10896 1726882175.22703: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10896 1726882175.22705: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882175.22715: getting variables 10896 1726882175.22716: in VariableManager get_vars() 10896 1726882175.22749: Calling all_inventory to load vars for managed_node2 10896 1726882175.22751: Calling groups_inventory to load vars for managed_node2 10896 1726882175.22753: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882175.22761: Calling all_plugins_play to load vars for managed_node2 10896 1726882175.22764: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882175.22766: Calling groups_plugins_play to load vars for managed_node2 10896 1726882175.25005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882175.26855: done with get_vars() 10896 1726882175.26886: done getting variables 10896 1726882175.26954: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:29:35 -0400 (0:00:01.590) 0:00:16.836 ****** 10896 1726882175.26992: entering _queue_task() for managed_node2/service 10896 1726882175.27654: worker is 1 (out of 1 available) 10896 1726882175.27666: exiting _queue_task() for managed_node2/service 10896 1726882175.27740: done queuing things up, now waiting for results queue to drain 10896 1726882175.27742: waiting for pending results... 
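Annotation: the JSON document that AnsiballZ_systemd.py wrote to stdout above is essentially a dump of the NetworkManager unit's systemd properties (ActiveState, UnitFileState, ExecStart, resource limits, and so on), which the systemd module then wraps in its task result. Outside Ansible, a handful of those same fields can be read with systemctl show; the snippet below is only an illustrative sketch and the property subset chosen is arbitrary.

    # Illustrative only: read a few of the unit properties that appear in the
    # module result above, using "systemctl show" on a systemd host.
    import subprocess

    def unit_properties(unit, props=("ActiveState", "SubState", "UnitFileState", "MainPID")):
        out = subprocess.run(
            ["systemctl", "show", unit, "--property=" + ",".join(props)],
            capture_output=True, text=True, check=True,
        ).stdout
        # Output is one KEY=VALUE pair per line, e.g. "ActiveState=active".
        return dict(line.split("=", 1) for line in out.splitlines() if "=" in line)

    if __name__ == "__main__":
        print(unit_properties("NetworkManager.service"))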
10896 1726882175.28425: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10896 1726882175.28432: in run() - task 12673a56-9f93-8b02-b216-000000000033 10896 1726882175.28454: variable 'ansible_search_path' from source: unknown 10896 1726882175.28463: variable 'ansible_search_path' from source: unknown 10896 1726882175.28599: calling self._execute() 10896 1726882175.28692: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882175.28854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882175.28955: variable 'omit' from source: magic vars 10896 1726882175.29636: variable 'ansible_distribution_major_version' from source: facts 10896 1726882175.29653: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882175.29876: variable 'network_provider' from source: set_fact 10896 1726882175.29888: Evaluated conditional (network_provider == "nm"): True 10896 1726882175.30019: variable '__network_wpa_supplicant_required' from source: role '' defaults 10896 1726882175.30305: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10896 1726882175.30797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882175.34906: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882175.34912: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882175.34954: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882175.34988: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882175.35018: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882175.35121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882175.35141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882175.35168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882175.35213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882175.35228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882175.35283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882175.35309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 10896 1726882175.35334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882175.35560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882175.35565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882175.35568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882175.35571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882175.35574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882175.35577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882175.35580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882175.35778: variable 'network_connections' from source: task vars 10896 1726882175.35782: variable 'controller_profile' from source: play vars 10896 1726882175.35801: variable 'controller_profile' from source: play vars 10896 1726882175.35811: variable 'controller_device' from source: play vars 10896 1726882175.35877: variable 'controller_device' from source: play vars 10896 1726882175.35905: variable 'port1_profile' from source: play vars 10896 1726882175.35986: variable 'port1_profile' from source: play vars 10896 1726882175.36009: variable 'dhcp_interface1' from source: play vars 10896 1726882175.36078: variable 'dhcp_interface1' from source: play vars 10896 1726882175.36099: variable 'controller_profile' from source: play vars 10896 1726882175.36172: variable 'controller_profile' from source: play vars 10896 1726882175.36184: variable 'port2_profile' from source: play vars 10896 1726882175.36259: variable 'port2_profile' from source: play vars 10896 1726882175.36274: variable 'dhcp_interface2' from source: play vars 10896 1726882175.36400: variable 'dhcp_interface2' from source: play vars 10896 1726882175.36404: variable 'controller_profile' from source: play vars 10896 1726882175.36425: variable 'controller_profile' from source: play vars 10896 1726882175.36514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882175.36704: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882175.36744: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882175.36789: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882175.36828: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882175.36885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882175.36916: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 1726882175.36979: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882175.36999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882175.37045: variable '__network_wireless_connections_defined' from source: role '' defaults 10896 1726882175.37413: variable 'network_connections' from source: task vars 10896 1726882175.37417: variable 'controller_profile' from source: play vars 10896 1726882175.37424: variable 'controller_profile' from source: play vars 10896 1726882175.37426: variable 'controller_device' from source: play vars 10896 1726882175.37474: variable 'controller_device' from source: play vars 10896 1726882175.37487: variable 'port1_profile' from source: play vars 10896 1726882175.37562: variable 'port1_profile' from source: play vars 10896 1726882175.37573: variable 'dhcp_interface1' from source: play vars 10896 1726882175.37646: variable 'dhcp_interface1' from source: play vars 10896 1726882175.37658: variable 'controller_profile' from source: play vars 10896 1726882175.37721: variable 'controller_profile' from source: play vars 10896 1726882175.37741: variable 'port2_profile' from source: play vars 10896 1726882175.37821: variable 'port2_profile' from source: play vars 10896 1726882175.37833: variable 'dhcp_interface2' from source: play vars 10896 1726882175.37906: variable 'dhcp_interface2' from source: play vars 10896 1726882175.37917: variable 'controller_profile' from source: play vars 10896 1726882175.37987: variable 'controller_profile' from source: play vars 10896 1726882175.38035: Evaluated conditional (__network_wpa_supplicant_required): False 10896 1726882175.38064: when evaluation is False, skipping this task 10896 1726882175.38069: _execute() done 10896 1726882175.38074: dumping result to json 10896 1726882175.38077: done dumping result, returning 10896 1726882175.38174: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-8b02-b216-000000000033] 10896 1726882175.38177: sending task result for task 12673a56-9f93-8b02-b216-000000000033 10896 1726882175.38252: done sending task result for task 12673a56-9f93-8b02-b216-000000000033 10896 1726882175.38256: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 10896 1726882175.38337: no more pending results, returning what we have 10896 1726882175.38340: results queue empty 10896 1726882175.38341: checking for any_errors_fatal 10896 1726882175.38359: done checking for any_errors_fatal 10896 
1726882175.38360: checking for max_fail_percentage 10896 1726882175.38361: done checking for max_fail_percentage 10896 1726882175.38362: checking to see if all hosts have failed and the running result is not ok 10896 1726882175.38363: done checking to see if all hosts have failed 10896 1726882175.38364: getting the remaining hosts for this loop 10896 1726882175.38365: done getting the remaining hosts for this loop 10896 1726882175.38369: getting the next task for host managed_node2 10896 1726882175.38375: done getting next task for host managed_node2 10896 1726882175.38379: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 10896 1726882175.38387: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882175.38405: getting variables 10896 1726882175.38407: in VariableManager get_vars() 10896 1726882175.38454: Calling all_inventory to load vars for managed_node2 10896 1726882175.38457: Calling groups_inventory to load vars for managed_node2 10896 1726882175.38460: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882175.38470: Calling all_plugins_play to load vars for managed_node2 10896 1726882175.38473: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882175.38475: Calling groups_plugins_play to load vars for managed_node2 10896 1726882175.40222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882175.41920: done with get_vars() 10896 1726882175.41944: done getting variables 10896 1726882175.42015: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:29:35 -0400 (0:00:00.150) 0:00:16.987 ****** 10896 1726882175.42049: entering _queue_task() for managed_node2/service 10896 1726882175.42399: worker is 1 (out of 1 available) 10896 1726882175.42522: exiting _queue_task() for managed_node2/service 10896 1726882175.42533: done queuing things up, now waiting for results queue to drain 10896 1726882175.42534: waiting for pending results... 
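Annotation: the skip decisions logged around here, where network_provider == "nm" evaluates True but __network_wpa_supplicant_required evaluates False and the task is skipped with "when evaluation is False, skipping this task", boil down to rendering each when: expression through Jinja2 against the task's variables and truth-testing the result. The sketch below mimics that reduction in the crudest way; it is not Ansible's conditional handling and omits all of its safeguards (it assumes the jinja2 package is installed).

    # Crude sketch of reducing a "when:" expression to a boolean with Jinja2;
    # illustration only, not Ansible's conditional code.
    from jinja2 import Environment

    def evaluate_when(expression, variables):
        rendered = Environment().from_string("{{ %s }}" % expression).render(**variables)
        return rendered == "True"

    task_vars = {"network_provider": "nm", "__network_wpa_supplicant_required": False}
    print(evaluate_when("network_provider == 'nm'", task_vars))           # True
    print(evaluate_when("__network_wpa_supplicant_required", task_vars))  # False, so the task is skipped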
10896 1726882175.42747: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 10896 1726882175.42856: in run() - task 12673a56-9f93-8b02-b216-000000000034 10896 1726882175.42880: variable 'ansible_search_path' from source: unknown 10896 1726882175.42887: variable 'ansible_search_path' from source: unknown 10896 1726882175.42931: calling self._execute() 10896 1726882175.43033: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882175.43061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882175.43064: variable 'omit' from source: magic vars 10896 1726882175.43499: variable 'ansible_distribution_major_version' from source: facts 10896 1726882175.43503: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882175.43626: variable 'network_provider' from source: set_fact 10896 1726882175.43639: Evaluated conditional (network_provider == "initscripts"): False 10896 1726882175.43712: when evaluation is False, skipping this task 10896 1726882175.43715: _execute() done 10896 1726882175.43717: dumping result to json 10896 1726882175.43720: done dumping result, returning 10896 1726882175.43722: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-8b02-b216-000000000034] 10896 1726882175.43725: sending task result for task 12673a56-9f93-8b02-b216-000000000034 10896 1726882175.43796: done sending task result for task 12673a56-9f93-8b02-b216-000000000034 10896 1726882175.43800: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10896 1726882175.43859: no more pending results, returning what we have 10896 1726882175.43863: results queue empty 10896 1726882175.43864: checking for any_errors_fatal 10896 1726882175.43874: done checking for any_errors_fatal 10896 1726882175.43875: checking for max_fail_percentage 10896 1726882175.43877: done checking for max_fail_percentage 10896 1726882175.43878: checking to see if all hosts have failed and the running result is not ok 10896 1726882175.43879: done checking to see if all hosts have failed 10896 1726882175.43879: getting the remaining hosts for this loop 10896 1726882175.43881: done getting the remaining hosts for this loop 10896 1726882175.43884: getting the next task for host managed_node2 10896 1726882175.43891: done getting next task for host managed_node2 10896 1726882175.43898: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10896 1726882175.43902: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882175.43921: getting variables 10896 1726882175.43922: in VariableManager get_vars() 10896 1726882175.43964: Calling all_inventory to load vars for managed_node2 10896 1726882175.43967: Calling groups_inventory to load vars for managed_node2 10896 1726882175.43970: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882175.43982: Calling all_plugins_play to load vars for managed_node2 10896 1726882175.43985: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882175.43988: Calling groups_plugins_play to load vars for managed_node2 10896 1726882175.45643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882175.47488: done with get_vars() 10896 1726882175.47512: done getting variables 10896 1726882175.47569: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:29:35 -0400 (0:00:00.055) 0:00:17.043 ****** 10896 1726882175.47616: entering _queue_task() for managed_node2/copy 10896 1726882175.48060: worker is 1 (out of 1 available) 10896 1726882175.48071: exiting _queue_task() for managed_node2/copy 10896 1726882175.48082: done queuing things up, now waiting for results queue to drain 10896 1726882175.48083: waiting for pending results... 10896 1726882175.48415: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10896 1726882175.48423: in run() - task 12673a56-9f93-8b02-b216-000000000035 10896 1726882175.48440: variable 'ansible_search_path' from source: unknown 10896 1726882175.48448: variable 'ansible_search_path' from source: unknown 10896 1726882175.48499: calling self._execute() 10896 1726882175.48637: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882175.48641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882175.48644: variable 'omit' from source: magic vars 10896 1726882175.49037: variable 'ansible_distribution_major_version' from source: facts 10896 1726882175.49054: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882175.49185: variable 'network_provider' from source: set_fact 10896 1726882175.49200: Evaluated conditional (network_provider == "initscripts"): False 10896 1726882175.49288: when evaluation is False, skipping this task 10896 1726882175.49296: _execute() done 10896 1726882175.49300: dumping result to json 10896 1726882175.49302: done dumping result, returning 10896 1726882175.49305: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-8b02-b216-000000000035] 10896 1726882175.49307: sending task result for task 12673a56-9f93-8b02-b216-000000000035 10896 1726882175.49377: done sending task result for task 12673a56-9f93-8b02-b216-000000000035 10896 1726882175.49379: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 10896 1726882175.49430: no more pending results, returning what we have 10896 1726882175.49433: results queue empty 10896 1726882175.49434: checking for any_errors_fatal 10896 1726882175.49441: done checking for any_errors_fatal 10896 1726882175.49442: checking for max_fail_percentage 10896 1726882175.49444: done checking for max_fail_percentage 10896 1726882175.49445: checking to see if all hosts have failed and the running result is not ok 10896 1726882175.49445: done checking to see if all hosts have failed 10896 1726882175.49446: getting the remaining hosts for this loop 10896 1726882175.49447: done getting the remaining hosts for this loop 10896 1726882175.49451: getting the next task for host managed_node2 10896 1726882175.49458: done getting next task for host managed_node2 10896 1726882175.49462: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10896 1726882175.49466: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882175.49482: getting variables 10896 1726882175.49484: in VariableManager get_vars() 10896 1726882175.49529: Calling all_inventory to load vars for managed_node2 10896 1726882175.49532: Calling groups_inventory to load vars for managed_node2 10896 1726882175.49535: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882175.49546: Calling all_plugins_play to load vars for managed_node2 10896 1726882175.49549: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882175.49551: Calling groups_plugins_play to load vars for managed_node2 10896 1726882175.51186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882175.53671: done with get_vars() 10896 1726882175.53702: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:29:35 -0400 (0:00:00.061) 0:00:17.104 ****** 10896 1726882175.53797: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 10896 1726882175.53799: Creating lock for fedora.linux_system_roles.network_connections 10896 1726882175.54246: worker is 1 (out of 1 available) 10896 1726882175.54258: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 10896 1726882175.54269: done queuing things up, now waiting for results queue to drain 10896 1726882175.54270: waiting for pending results... 
10896 1726882175.54779: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10896 1726882175.54920: in run() - task 12673a56-9f93-8b02-b216-000000000036 10896 1726882175.54935: variable 'ansible_search_path' from source: unknown 10896 1726882175.54938: variable 'ansible_search_path' from source: unknown 10896 1726882175.55032: calling self._execute() 10896 1726882175.55245: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882175.55249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882175.55255: variable 'omit' from source: magic vars 10896 1726882175.56077: variable 'ansible_distribution_major_version' from source: facts 10896 1726882175.56186: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882175.56190: variable 'omit' from source: magic vars 10896 1726882175.56197: variable 'omit' from source: magic vars 10896 1726882175.56590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882175.61005: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882175.61136: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882175.61178: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882175.61218: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882175.61312: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882175.61448: variable 'network_provider' from source: set_fact 10896 1726882175.61746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882175.62561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882175.62646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882175.62690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882175.62903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882175.62975: variable 'omit' from source: magic vars 10896 1726882175.63165: variable 'omit' from source: magic vars 10896 1726882175.63487: variable 'network_connections' from source: task vars 10896 1726882175.63509: variable 'controller_profile' from source: play vars 10896 1726882175.63568: variable 'controller_profile' from source: play vars 10896 1726882175.63572: variable 'controller_device' from source: play vars 10896 1726882175.63748: variable 'controller_device' from source: play vars 10896 1726882175.63757: variable 'port1_profile' 
from source: play vars 10896 1726882175.63847: variable 'port1_profile' from source: play vars 10896 1726882175.63854: variable 'dhcp_interface1' from source: play vars 10896 1726882175.63911: variable 'dhcp_interface1' from source: play vars 10896 1726882175.63918: variable 'controller_profile' from source: play vars 10896 1726882175.64091: variable 'controller_profile' from source: play vars 10896 1726882175.64186: variable 'port2_profile' from source: play vars 10896 1726882175.64217: variable 'port2_profile' from source: play vars 10896 1726882175.64224: variable 'dhcp_interface2' from source: play vars 10896 1726882175.64401: variable 'dhcp_interface2' from source: play vars 10896 1726882175.64407: variable 'controller_profile' from source: play vars 10896 1726882175.64500: variable 'controller_profile' from source: play vars 10896 1726882175.64977: variable 'omit' from source: magic vars 10896 1726882175.64986: variable '__lsr_ansible_managed' from source: task vars 10896 1726882175.65122: variable '__lsr_ansible_managed' from source: task vars 10896 1726882175.65705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 10896 1726882175.65874: Loaded config def from plugin (lookup/template) 10896 1726882175.65877: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 10896 1726882175.66019: File lookup term: get_ansible_managed.j2 10896 1726882175.66024: variable 'ansible_search_path' from source: unknown 10896 1726882175.66027: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 10896 1726882175.66074: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 10896 1726882175.66077: variable 'ansible_search_path' from source: unknown 10896 1726882175.77870: variable 'ansible_managed' from source: unknown 10896 1726882175.78007: variable 'omit' from source: magic vars 10896 1726882175.78106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882175.78110: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882175.78119: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882175.78300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882175.78303: Loading ShellModule 'sh' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882175.78306: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882175.78308: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882175.78310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882175.78312: Set connection var ansible_connection to ssh 10896 1726882175.78314: Set connection var ansible_timeout to 10 10896 1726882175.78316: Set connection var ansible_shell_type to sh 10896 1726882175.78318: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882175.78320: Set connection var ansible_shell_executable to /bin/sh 10896 1726882175.78330: Set connection var ansible_pipelining to False 10896 1726882175.78674: variable 'ansible_shell_executable' from source: unknown 10896 1726882175.78678: variable 'ansible_connection' from source: unknown 10896 1726882175.78680: variable 'ansible_module_compression' from source: unknown 10896 1726882175.78682: variable 'ansible_shell_type' from source: unknown 10896 1726882175.78684: variable 'ansible_shell_executable' from source: unknown 10896 1726882175.78686: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882175.78701: variable 'ansible_pipelining' from source: unknown 10896 1726882175.78703: variable 'ansible_timeout' from source: unknown 10896 1726882175.78705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882175.78890: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10896 1726882175.79105: variable 'omit' from source: magic vars 10896 1726882175.79110: starting attempt loop 10896 1726882175.79113: running the handler 10896 1726882175.79115: _low_level_execute_command(): starting 10896 1726882175.79117: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882175.80800: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882175.80805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882175.80807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882175.80810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882175.80918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 10896 1726882175.82594: stdout chunk (state=3): >>>/root <<< 10896 1726882175.82681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882175.82729: stderr chunk (state=3): >>><<< 10896 1726882175.82812: stdout chunk (state=3): >>><<< 10896 1726882175.82836: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882175.82848: _low_level_execute_command(): starting 10896 1726882175.82855: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882175.8283577-11784-274199725408213 `" && echo ansible-tmp-1726882175.8283577-11784-274199725408213="` echo /root/.ansible/tmp/ansible-tmp-1726882175.8283577-11784-274199725408213 `" ) && sleep 0' 10896 1726882175.84182: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882175.84310: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882175.84321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882175.84352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882175.84364: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882175.84371: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882175.84381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882175.84400: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882175.84407: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 10896 1726882175.84414: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10896 1726882175.84426: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882175.84432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882175.84444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882175.84452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 
<<< 10896 1726882175.84458: stderr chunk (state=3): >>>debug2: match found <<< 10896 1726882175.84535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882175.84647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882175.84755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882175.84863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882175.86830: stdout chunk (state=3): >>>ansible-tmp-1726882175.8283577-11784-274199725408213=/root/.ansible/tmp/ansible-tmp-1726882175.8283577-11784-274199725408213 <<< 10896 1726882175.87152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882175.87167: stdout chunk (state=3): >>><<< 10896 1726882175.87170: stderr chunk (state=3): >>><<< 10896 1726882175.87173: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882175.8283577-11784-274199725408213=/root/.ansible/tmp/ansible-tmp-1726882175.8283577-11784-274199725408213 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882175.87268: variable 'ansible_module_compression' from source: unknown 10896 1726882175.87278: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 10896 1726882175.87281: ANSIBALLZ: Acquiring lock 10896 1726882175.87283: ANSIBALLZ: Lock acquired: 139646160938736 10896 1726882175.87285: ANSIBALLZ: Creating module 10896 1726882176.22154: ANSIBALLZ: Writing module into payload 10896 1726882176.22536: ANSIBALLZ: Writing module 10896 1726882176.22539: ANSIBALLZ: Renaming module 10896 1726882176.22541: ANSIBALLZ: Done creating module 10896 1726882176.22545: variable 'ansible_facts' from source: unknown 10896 1726882176.22676: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882175.8283577-11784-274199725408213/AnsiballZ_network_connections.py 10896 1726882176.22796: Sending initial data 10896 1726882176.22800: Sent initial data (168 bytes) 10896 1726882176.23297: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882176.23301: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882176.23304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 10896 1726882176.23306: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882176.23308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882176.23353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882176.23356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882176.23442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882176.24992: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882176.25066: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10896 1726882176.25126: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpaaxm9wsc /root/.ansible/tmp/ansible-tmp-1726882175.8283577-11784-274199725408213/AnsiballZ_network_connections.py <<< 10896 1726882176.25130: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882175.8283577-11784-274199725408213/AnsiballZ_network_connections.py" <<< 10896 1726882176.25192: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 10896 1726882176.25198: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpaaxm9wsc" to remote "/root/.ansible/tmp/ansible-tmp-1726882175.8283577-11784-274199725408213/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882175.8283577-11784-274199725408213/AnsiballZ_network_connections.py" <<< 10896 1726882176.26569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882176.26633: stderr chunk (state=3): >>><<< 10896 1726882176.26638: stdout chunk (state=3): >>><<< 10896 1726882176.26641: done transferring module to remote 10896 1726882176.26643: _low_level_execute_command(): starting 10896 1726882176.26673: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882175.8283577-11784-274199725408213/ /root/.ansible/tmp/ansible-tmp-1726882175.8283577-11784-274199725408213/AnsiballZ_network_connections.py && sleep 0' 10896 1726882176.27076: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882176.27079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882176.27083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10896 1726882176.27086: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882176.27088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882176.27156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882176.27160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882176.27235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882176.28974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882176.29000: stderr chunk (state=3): >>><<< 10896 1726882176.29002: stdout chunk (state=3): >>><<< 10896 1726882176.29058: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882176.29061: _low_level_execute_command(): starting 10896 1726882176.29065: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882175.8283577-11784-274199725408213/AnsiballZ_network_connections.py && sleep 0' 10896 1726882176.29401: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882176.29405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882176.29431: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 10896 1726882176.29434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882176.29459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882176.29476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882176.29545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882176.70859: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, dd49500a-dfac-4315-8e04-f2fc7751ae5d\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 44428abe-10c8-4edc-8c8f-782783521b9e\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 68ad5ee6-abdc-455f-b5cc-5daf41c6bbbb\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, dd49500a-dfac-4315-8e04-f2fc7751ae5d (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 44428abe-10c8-4edc-8c8f-782783521b9e (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up 
connection bond0.1, 68ad5ee6-abdc-455f-b5cc-5daf41c6bbbb (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 10896 1726882176.73007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882176.73013: stdout chunk (state=3): >>><<< 10896 1726882176.73021: stderr chunk (state=3): >>><<< 10896 1726882176.73025: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, dd49500a-dfac-4315-8e04-f2fc7751ae5d\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 44428abe-10c8-4edc-8c8f-782783521b9e\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 68ad5ee6-abdc-455f-b5cc-5daf41c6bbbb\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, dd49500a-dfac-4315-8e04-f2fc7751ae5d (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 44428abe-10c8-4edc-8c8f-782783521b9e (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 68ad5ee6-abdc-455f-b5cc-5daf41c6bbbb (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 10896 1726882176.73027: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'deprecated-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'master': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'master': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882175.8283577-11784-274199725408213/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882176.73033: _low_level_execute_command(): starting 10896 1726882176.73035: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882175.8283577-11784-274199725408213/ > /dev/null 2>&1 && sleep 0' 10896 1726882176.73677: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 
10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882176.73716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882176.73728: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882176.73753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882176.73854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882176.75645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882176.75667: stderr chunk (state=3): >>><<< 10896 1726882176.75671: stdout chunk (state=3): >>><<< 10896 1726882176.75683: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882176.75689: handler run complete 10896 1726882176.75723: attempt loop complete, returning result 10896 1726882176.75726: _execute() done 10896 1726882176.75729: dumping result to json 10896 1726882176.75735: done dumping result, returning 10896 1726882176.75743: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-8b02-b216-000000000036] 10896 1726882176.75748: sending task result for task 12673a56-9f93-8b02-b216-000000000036 10896 1726882176.75872: done sending task result for task 12673a56-9f93-8b02-b216-000000000036 10896 1726882176.75875: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "deprecated-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "interface_name": "test1", "master": "bond0", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "interface_name": "test2", "master": "bond0", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, dd49500a-dfac-4315-8e04-f2fc7751ae5d [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 
44428abe-10c8-4edc-8c8f-782783521b9e [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 68ad5ee6-abdc-455f-b5cc-5daf41c6bbbb [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, dd49500a-dfac-4315-8e04-f2fc7751ae5d (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 44428abe-10c8-4edc-8c8f-782783521b9e (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 68ad5ee6-abdc-455f-b5cc-5daf41c6bbbb (not-active) 10896 1726882176.76052: no more pending results, returning what we have 10896 1726882176.76055: results queue empty 10896 1726882176.76055: checking for any_errors_fatal 10896 1726882176.76060: done checking for any_errors_fatal 10896 1726882176.76061: checking for max_fail_percentage 10896 1726882176.76063: done checking for max_fail_percentage 10896 1726882176.76064: checking to see if all hosts have failed and the running result is not ok 10896 1726882176.76064: done checking to see if all hosts have failed 10896 1726882176.76065: getting the remaining hosts for this loop 10896 1726882176.76066: done getting the remaining hosts for this loop 10896 1726882176.76069: getting the next task for host managed_node2 10896 1726882176.76075: done getting next task for host managed_node2 10896 1726882176.76079: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 10896 1726882176.76084: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882176.76095: getting variables 10896 1726882176.76096: in VariableManager get_vars() 10896 1726882176.76146: Calling all_inventory to load vars for managed_node2 10896 1726882176.76149: Calling groups_inventory to load vars for managed_node2 10896 1726882176.76152: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882176.76161: Calling all_plugins_play to load vars for managed_node2 10896 1726882176.76165: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882176.76168: Calling groups_plugins_play to load vars for managed_node2 10896 1726882176.77423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882176.78272: done with get_vars() 10896 1726882176.78292: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:29:36 -0400 (0:00:01.245) 0:00:18.350 ****** 10896 1726882176.78356: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 10896 1726882176.78357: Creating lock for fedora.linux_system_roles.network_state 10896 1726882176.78574: worker is 1 (out of 1 available) 10896 1726882176.78586: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 10896 1726882176.78600: done queuing things up, now waiting for results queue to drain 10896 1726882176.78601: waiting for pending results... 10896 1726882176.78761: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 10896 1726882176.78841: in run() - task 12673a56-9f93-8b02-b216-000000000037 10896 1726882176.78854: variable 'ansible_search_path' from source: unknown 10896 1726882176.78858: variable 'ansible_search_path' from source: unknown 10896 1726882176.78886: calling self._execute() 10896 1726882176.78958: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882176.78962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882176.78970: variable 'omit' from source: magic vars 10896 1726882176.79240: variable 'ansible_distribution_major_version' from source: facts 10896 1726882176.79249: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882176.79334: variable 'network_state' from source: role '' defaults 10896 1726882176.79342: Evaluated conditional (network_state != {}): False 10896 1726882176.79345: when evaluation is False, skipping this task 10896 1726882176.79348: _execute() done 10896 1726882176.79350: dumping result to json 10896 1726882176.79353: done dumping result, returning 10896 1726882176.79361: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-8b02-b216-000000000037] 10896 1726882176.79367: sending task result for task 12673a56-9f93-8b02-b216-000000000037 10896 1726882176.79449: done sending task result for task 12673a56-9f93-8b02-b216-000000000037 10896 1726882176.79451: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10896 1726882176.79533: no more pending results, returning what we have 10896 1726882176.79536: results queue empty 10896 1726882176.79536: checking for any_errors_fatal 10896 1726882176.79543: done checking for 
any_errors_fatal 10896 1726882176.79544: checking for max_fail_percentage 10896 1726882176.79546: done checking for max_fail_percentage 10896 1726882176.79546: checking to see if all hosts have failed and the running result is not ok 10896 1726882176.79547: done checking to see if all hosts have failed 10896 1726882176.79548: getting the remaining hosts for this loop 10896 1726882176.79549: done getting the remaining hosts for this loop 10896 1726882176.79552: getting the next task for host managed_node2 10896 1726882176.79557: done getting next task for host managed_node2 10896 1726882176.79560: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10896 1726882176.79562: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882176.79574: getting variables 10896 1726882176.79575: in VariableManager get_vars() 10896 1726882176.79611: Calling all_inventory to load vars for managed_node2 10896 1726882176.79613: Calling groups_inventory to load vars for managed_node2 10896 1726882176.79616: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882176.79623: Calling all_plugins_play to load vars for managed_node2 10896 1726882176.79625: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882176.79627: Calling groups_plugins_play to load vars for managed_node2 10896 1726882176.80329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882176.81191: done with get_vars() 10896 1726882176.81211: done getting variables 10896 1726882176.81256: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:29:36 -0400 (0:00:00.029) 0:00:18.379 ****** 10896 1726882176.81279: entering _queue_task() for managed_node2/debug 10896 1726882176.81481: worker is 1 (out of 1 available) 10896 1726882176.81496: exiting _queue_task() for managed_node2/debug 10896 1726882176.81508: done queuing things up, now waiting for results queue to drain 10896 1726882176.81509: waiting for pending results... 
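For context, the module_args echoed in the "Configure networking connection profiles" result above correspond to a network_connections play variable shaped roughly like the sketch below. The controller/port indirection mirrors the play vars named in the trace (controller_profile, controller_device, port1_profile, port2_profile, dhcp_interface1, dhcp_interface2); the exact templating used by the test playbook is an assumption, but the resolved values (bond0, deprecated-bond, test1, test2, active-backup, miimon 110, route_metric4 65535) are taken directly from the invocation.

controller_profile: bond0
controller_device: deprecated-bond
dhcp_interface1: test1
dhcp_interface2: test2
port1_profile: bond0.0
port2_profile: bond0.1

network_connections:
  # bond controller in active-backup mode, as shown in the module result
  - name: "{{ controller_profile }}"
    state: up
    type: bond
    interface_name: "{{ controller_device }}"
    bond:
      mode: active-backup
      miimon: 110
    ip:
      route_metric4: 65535
  # two ethernet ports enslaved to the bond
  - name: "{{ port1_profile }}"
    state: up
    type: ethernet
    interface_name: "{{ dhcp_interface1 }}"
    master: "{{ controller_profile }}"
  - name: "{{ port2_profile }}"
    state: up
    type: ethernet
    interface_name: "{{ dhcp_interface2 }}"
    master: "{{ controller_profile }}"

The debug tasks that run next simply print back the registered result: the first shows __network_connections_result.stderr_lines, and the second dumps the full __network_connections_result, including the module invocation and the stderr lines captured above.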
10896 1726882176.81672: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10896 1726882176.81750: in run() - task 12673a56-9f93-8b02-b216-000000000038 10896 1726882176.81762: variable 'ansible_search_path' from source: unknown 10896 1726882176.81765: variable 'ansible_search_path' from source: unknown 10896 1726882176.81792: calling self._execute() 10896 1726882176.81863: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882176.81867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882176.81875: variable 'omit' from source: magic vars 10896 1726882176.82136: variable 'ansible_distribution_major_version' from source: facts 10896 1726882176.82144: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882176.82150: variable 'omit' from source: magic vars 10896 1726882176.82192: variable 'omit' from source: magic vars 10896 1726882176.82219: variable 'omit' from source: magic vars 10896 1726882176.82248: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882176.82278: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882176.82294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882176.82309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882176.82318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882176.82340: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882176.82343: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882176.82345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882176.82417: Set connection var ansible_connection to ssh 10896 1726882176.82421: Set connection var ansible_timeout to 10 10896 1726882176.82424: Set connection var ansible_shell_type to sh 10896 1726882176.82431: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882176.82436: Set connection var ansible_shell_executable to /bin/sh 10896 1726882176.82441: Set connection var ansible_pipelining to False 10896 1726882176.82458: variable 'ansible_shell_executable' from source: unknown 10896 1726882176.82461: variable 'ansible_connection' from source: unknown 10896 1726882176.82464: variable 'ansible_module_compression' from source: unknown 10896 1726882176.82466: variable 'ansible_shell_type' from source: unknown 10896 1726882176.82468: variable 'ansible_shell_executable' from source: unknown 10896 1726882176.82470: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882176.82475: variable 'ansible_pipelining' from source: unknown 10896 1726882176.82477: variable 'ansible_timeout' from source: unknown 10896 1726882176.82481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882176.82578: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 
1726882176.82588: variable 'omit' from source: magic vars 10896 1726882176.82594: starting attempt loop 10896 1726882176.82599: running the handler 10896 1726882176.82689: variable '__network_connections_result' from source: set_fact 10896 1726882176.82740: handler run complete 10896 1726882176.82753: attempt loop complete, returning result 10896 1726882176.82756: _execute() done 10896 1726882176.82759: dumping result to json 10896 1726882176.82761: done dumping result, returning 10896 1726882176.82768: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-8b02-b216-000000000038] 10896 1726882176.82772: sending task result for task 12673a56-9f93-8b02-b216-000000000038 10896 1726882176.82852: done sending task result for task 12673a56-9f93-8b02-b216-000000000038 10896 1726882176.82855: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, dd49500a-dfac-4315-8e04-f2fc7751ae5d", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 44428abe-10c8-4edc-8c8f-782783521b9e", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 68ad5ee6-abdc-455f-b5cc-5daf41c6bbbb", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, dd49500a-dfac-4315-8e04-f2fc7751ae5d (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 44428abe-10c8-4edc-8c8f-782783521b9e (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 68ad5ee6-abdc-455f-b5cc-5daf41c6bbbb (not-active)" ] } 10896 1726882176.82918: no more pending results, returning what we have 10896 1726882176.82921: results queue empty 10896 1726882176.82922: checking for any_errors_fatal 10896 1726882176.82928: done checking for any_errors_fatal 10896 1726882176.82928: checking for max_fail_percentage 10896 1726882176.82930: done checking for max_fail_percentage 10896 1726882176.82930: checking to see if all hosts have failed and the running result is not ok 10896 1726882176.82931: done checking to see if all hosts have failed 10896 1726882176.82932: getting the remaining hosts for this loop 10896 1726882176.82933: done getting the remaining hosts for this loop 10896 1726882176.82937: getting the next task for host managed_node2 10896 1726882176.82942: done getting next task for host managed_node2 10896 1726882176.82946: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 10896 1726882176.82948: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882176.82958: getting variables 10896 1726882176.82959: in VariableManager get_vars() 10896 1726882176.82992: Calling all_inventory to load vars for managed_node2 10896 1726882176.82996: Calling groups_inventory to load vars for managed_node2 10896 1726882176.82998: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882176.83006: Calling all_plugins_play to load vars for managed_node2 10896 1726882176.83008: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882176.83010: Calling groups_plugins_play to load vars for managed_node2 10896 1726882176.83839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882176.84707: done with get_vars() 10896 1726882176.84724: done getting variables 10896 1726882176.84765: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:29:36 -0400 (0:00:00.035) 0:00:18.414 ****** 10896 1726882176.84801: entering _queue_task() for managed_node2/debug 10896 1726882176.85034: worker is 1 (out of 1 available) 10896 1726882176.85047: exiting _queue_task() for managed_node2/debug 10896 1726882176.85060: done queuing things up, now waiting for results queue to drain 10896 1726882176.85061: waiting for pending results... 10896 1726882176.85235: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 10896 1726882176.85329: in run() - task 12673a56-9f93-8b02-b216-000000000039 10896 1726882176.85338: variable 'ansible_search_path' from source: unknown 10896 1726882176.85341: variable 'ansible_search_path' from source: unknown 10896 1726882176.85370: calling self._execute() 10896 1726882176.85444: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882176.85447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882176.85456: variable 'omit' from source: magic vars 10896 1726882176.85728: variable 'ansible_distribution_major_version' from source: facts 10896 1726882176.85741: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882176.85744: variable 'omit' from source: magic vars 10896 1726882176.85778: variable 'omit' from source: magic vars 10896 1726882176.85805: variable 'omit' from source: magic vars 10896 1726882176.85835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882176.85866: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882176.85881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882176.85898: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882176.85905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882176.85926: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882176.85929: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882176.85932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882176.86003: Set connection var ansible_connection to ssh 10896 1726882176.86009: Set connection var ansible_timeout to 10 10896 1726882176.86012: Set connection var ansible_shell_type to sh 10896 1726882176.86018: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882176.86024: Set connection var ansible_shell_executable to /bin/sh 10896 1726882176.86029: Set connection var ansible_pipelining to False 10896 1726882176.86048: variable 'ansible_shell_executable' from source: unknown 10896 1726882176.86051: variable 'ansible_connection' from source: unknown 10896 1726882176.86054: variable 'ansible_module_compression' from source: unknown 10896 1726882176.86058: variable 'ansible_shell_type' from source: unknown 10896 1726882176.86060: variable 'ansible_shell_executable' from source: unknown 10896 1726882176.86064: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882176.86066: variable 'ansible_pipelining' from source: unknown 10896 1726882176.86068: variable 'ansible_timeout' from source: unknown 10896 1726882176.86070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882176.86167: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882176.86175: variable 'omit' from source: magic vars 10896 1726882176.86189: starting attempt loop 10896 1726882176.86196: running the handler 10896 1726882176.86227: variable '__network_connections_result' from source: set_fact 10896 1726882176.86277: variable '__network_connections_result' from source: set_fact 10896 1726882176.86386: handler run complete 10896 1726882176.86410: attempt loop complete, returning result 10896 1726882176.86415: _execute() done 10896 1726882176.86418: dumping result to json 10896 1726882176.86422: done dumping result, returning 10896 1726882176.86431: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-8b02-b216-000000000039] 10896 1726882176.86434: sending task result for task 12673a56-9f93-8b02-b216-000000000039 10896 1726882176.86524: done sending task result for task 12673a56-9f93-8b02-b216-000000000039 10896 1726882176.86527: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "deprecated-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "interface_name": "test1", "master": "bond0", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "interface_name": "test2", "master": "bond0", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 
dd49500a-dfac-4315-8e04-f2fc7751ae5d\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 44428abe-10c8-4edc-8c8f-782783521b9e\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 68ad5ee6-abdc-455f-b5cc-5daf41c6bbbb\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, dd49500a-dfac-4315-8e04-f2fc7751ae5d (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 44428abe-10c8-4edc-8c8f-782783521b9e (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 68ad5ee6-abdc-455f-b5cc-5daf41c6bbbb (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, dd49500a-dfac-4315-8e04-f2fc7751ae5d", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 44428abe-10c8-4edc-8c8f-782783521b9e", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 68ad5ee6-abdc-455f-b5cc-5daf41c6bbbb", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, dd49500a-dfac-4315-8e04-f2fc7751ae5d (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 44428abe-10c8-4edc-8c8f-782783521b9e (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 68ad5ee6-abdc-455f-b5cc-5daf41c6bbbb (not-active)" ] } } 10896 1726882176.86625: no more pending results, returning what we have 10896 1726882176.86628: results queue empty 10896 1726882176.86633: checking for any_errors_fatal 10896 1726882176.86641: done checking for any_errors_fatal 10896 1726882176.86642: checking for max_fail_percentage 10896 1726882176.86643: done checking for max_fail_percentage 10896 1726882176.86644: checking to see if all hosts have failed and the running result is not ok 10896 1726882176.86645: done checking to see if all hosts have failed 10896 1726882176.86645: getting the remaining hosts for this loop 10896 1726882176.86646: done getting the remaining hosts for this loop 10896 1726882176.86649: getting the next task for host managed_node2 10896 1726882176.86654: done getting next task for host managed_node2 10896 1726882176.86657: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10896 1726882176.86659: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882176.86668: getting variables 10896 1726882176.86669: in VariableManager get_vars() 10896 1726882176.86713: Calling all_inventory to load vars for managed_node2 10896 1726882176.86716: Calling groups_inventory to load vars for managed_node2 10896 1726882176.86718: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882176.86726: Calling all_plugins_play to load vars for managed_node2 10896 1726882176.86728: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882176.86731: Calling groups_plugins_play to load vars for managed_node2 10896 1726882176.87681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882176.89315: done with get_vars() 10896 1726882176.89337: done getting variables 10896 1726882176.89397: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:29:36 -0400 (0:00:00.046) 0:00:18.461 ****** 10896 1726882176.89430: entering _queue_task() for managed_node2/debug 10896 1726882176.89748: worker is 1 (out of 1 available) 10896 1726882176.89761: exiting _queue_task() for managed_node2/debug 10896 1726882176.89774: done queuing things up, now waiting for results queue to drain 10896 1726882176.89775: waiting for pending results... 10896 1726882176.90108: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10896 1726882176.90205: in run() - task 12673a56-9f93-8b02-b216-00000000003a 10896 1726882176.90209: variable 'ansible_search_path' from source: unknown 10896 1726882176.90212: variable 'ansible_search_path' from source: unknown 10896 1726882176.90214: calling self._execute() 10896 1726882176.90336: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882176.90340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882176.90343: variable 'omit' from source: magic vars 10896 1726882176.90724: variable 'ansible_distribution_major_version' from source: facts 10896 1726882176.90728: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882176.90832: variable 'network_state' from source: role '' defaults 10896 1726882176.90842: Evaluated conditional (network_state != {}): False 10896 1726882176.90845: when evaluation is False, skipping this task 10896 1726882176.90848: _execute() done 10896 1726882176.90851: dumping result to json 10896 1726882176.90853: done dumping result, returning 10896 1726882176.90861: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-8b02-b216-00000000003a] 10896 1726882176.90866: sending task result for task 12673a56-9f93-8b02-b216-00000000003a 10896 1726882176.91029: done sending task result for task 12673a56-9f93-8b02-b216-00000000003a 10896 1726882176.91034: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 10896 1726882176.91081: no more pending results, returning what we 
have 10896 1726882176.91086: results queue empty 10896 1726882176.91087: checking for any_errors_fatal 10896 1726882176.91099: done checking for any_errors_fatal 10896 1726882176.91100: checking for max_fail_percentage 10896 1726882176.91102: done checking for max_fail_percentage 10896 1726882176.91103: checking to see if all hosts have failed and the running result is not ok 10896 1726882176.91104: done checking to see if all hosts have failed 10896 1726882176.91104: getting the remaining hosts for this loop 10896 1726882176.91106: done getting the remaining hosts for this loop 10896 1726882176.91109: getting the next task for host managed_node2 10896 1726882176.91116: done getting next task for host managed_node2 10896 1726882176.91120: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 10896 1726882176.91123: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882176.91138: getting variables 10896 1726882176.91139: in VariableManager get_vars() 10896 1726882176.91181: Calling all_inventory to load vars for managed_node2 10896 1726882176.91184: Calling groups_inventory to load vars for managed_node2 10896 1726882176.91186: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882176.91348: Calling all_plugins_play to load vars for managed_node2 10896 1726882176.91352: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882176.91356: Calling groups_plugins_play to load vars for managed_node2 10896 1726882176.92221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882176.93086: done with get_vars() 10896 1726882176.93105: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:29:36 -0400 (0:00:00.037) 0:00:18.498 ****** 10896 1726882176.93175: entering _queue_task() for managed_node2/ping 10896 1726882176.93176: Creating lock for ping 10896 1726882176.93436: worker is 1 (out of 1 available) 10896 1726882176.93448: exiting _queue_task() for managed_node2/ping 10896 1726882176.93460: done queuing things up, now waiting for results queue to drain 10896 1726882176.93461: waiting for pending results... 
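
The debug output above records the exact module arguments the role passed for the network_connections run. For reference, a minimal sketch of the role input that would produce that invocation, reconstructed only from the logged module_args (the actual test playbook may declare it differently, e.g. via vars files or role defaults):

    # Values copied verbatim from the module_args logged above.
    network_connections:
      - name: bond0
        state: up
        type: bond
        interface_name: deprecated-bond
        bond:
          mode: active-backup
          miimon: 110
        ip:
          route_metric4: 65535
      - name: bond0.0
        state: up
        type: ethernet
        interface_name: test1
        master: bond0
      - name: bond0.1
        state: up
        type: ethernet
        interface_name: test2
        master: bond0

The provider "nm", force_state_change: false and ignore_errors: false values seen in the module_args come from the role's defaults rather than from this variable. The log continues with the role's "Re-test connectivity" task (roles/network/tasks/main.yml:192), which the run below executes with the ping module.
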
10896 1726882176.93674: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 10896 1726882176.93787: in run() - task 12673a56-9f93-8b02-b216-00000000003b 10896 1726882176.93804: variable 'ansible_search_path' from source: unknown 10896 1726882176.93808: variable 'ansible_search_path' from source: unknown 10896 1726882176.93850: calling self._execute() 10896 1726882176.93945: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882176.93949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882176.94065: variable 'omit' from source: magic vars 10896 1726882176.94329: variable 'ansible_distribution_major_version' from source: facts 10896 1726882176.94339: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882176.94345: variable 'omit' from source: magic vars 10896 1726882176.94396: variable 'omit' from source: magic vars 10896 1726882176.94500: variable 'omit' from source: magic vars 10896 1726882176.94503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882176.94507: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882176.94514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882176.94530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882176.94542: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882176.94840: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882176.94846: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882176.94848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882176.95086: Set connection var ansible_connection to ssh 10896 1726882176.95096: Set connection var ansible_timeout to 10 10896 1726882176.95101: Set connection var ansible_shell_type to sh 10896 1726882176.95152: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882176.95156: Set connection var ansible_shell_executable to /bin/sh 10896 1726882176.95160: Set connection var ansible_pipelining to False 10896 1726882176.95162: variable 'ansible_shell_executable' from source: unknown 10896 1726882176.95165: variable 'ansible_connection' from source: unknown 10896 1726882176.95167: variable 'ansible_module_compression' from source: unknown 10896 1726882176.95169: variable 'ansible_shell_type' from source: unknown 10896 1726882176.95171: variable 'ansible_shell_executable' from source: unknown 10896 1726882176.95208: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882176.95211: variable 'ansible_pipelining' from source: unknown 10896 1726882176.95216: variable 'ansible_timeout' from source: unknown 10896 1726882176.95221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882176.95573: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10896 1726882176.95590: variable 'omit' from source: magic vars 10896 
1726882176.95595: starting attempt loop 10896 1726882176.95597: running the handler 10896 1726882176.95600: _low_level_execute_command(): starting 10896 1726882176.95602: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882176.96127: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882176.96134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882176.96138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882176.96225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882176.96260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882176.96345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882176.98049: stdout chunk (state=3): >>>/root <<< 10896 1726882176.98270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882176.98273: stdout chunk (state=3): >>><<< 10896 1726882176.98275: stderr chunk (state=3): >>><<< 10896 1726882176.98279: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882176.98289: _low_level_execute_command(): starting 10896 1726882176.98304: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882176.9825418-11842-264433652054512 `" && echo 
ansible-tmp-1726882176.9825418-11842-264433652054512="` echo /root/.ansible/tmp/ansible-tmp-1726882176.9825418-11842-264433652054512 `" ) && sleep 0' 10896 1726882176.99636: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882176.99640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882176.99643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882176.99646: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882176.99654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882176.99841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882176.99905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882176.99975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882177.01947: stdout chunk (state=3): >>>ansible-tmp-1726882176.9825418-11842-264433652054512=/root/.ansible/tmp/ansible-tmp-1726882176.9825418-11842-264433652054512 <<< 10896 1726882177.01950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882177.01970: stderr chunk (state=3): >>><<< 10896 1726882177.01973: stdout chunk (state=3): >>><<< 10896 1726882177.01994: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882176.9825418-11842-264433652054512=/root/.ansible/tmp/ansible-tmp-1726882176.9825418-11842-264433652054512 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882177.02302: variable 'ansible_module_compression' from source: 
unknown 10896 1726882177.02305: ANSIBALLZ: Using lock for ping 10896 1726882177.02307: ANSIBALLZ: Acquiring lock 10896 1726882177.02309: ANSIBALLZ: Lock acquired: 139646165374960 10896 1726882177.02311: ANSIBALLZ: Creating module 10896 1726882177.24944: ANSIBALLZ: Writing module into payload 10896 1726882177.25012: ANSIBALLZ: Writing module 10896 1726882177.25047: ANSIBALLZ: Renaming module 10896 1726882177.25060: ANSIBALLZ: Done creating module 10896 1726882177.25081: variable 'ansible_facts' from source: unknown 10896 1726882177.25165: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882176.9825418-11842-264433652054512/AnsiballZ_ping.py 10896 1726882177.25371: Sending initial data 10896 1726882177.25380: Sent initial data (153 bytes) 10896 1726882177.25985: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882177.26011: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882177.26029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882177.26049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882177.26066: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882177.26129: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882177.26182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882177.26210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882177.26235: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882177.26363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882177.28011: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882177.28104: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10896 1726882177.28161: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmp0uvmnyce /root/.ansible/tmp/ansible-tmp-1726882176.9825418-11842-264433652054512/AnsiballZ_ping.py <<< 10896 1726882177.28164: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882176.9825418-11842-264433652054512/AnsiballZ_ping.py" <<< 10896 1726882177.28239: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmp0uvmnyce" to remote "/root/.ansible/tmp/ansible-tmp-1726882176.9825418-11842-264433652054512/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882176.9825418-11842-264433652054512/AnsiballZ_ping.py" <<< 10896 1726882177.29251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882177.29340: stderr chunk (state=3): >>><<< 10896 1726882177.29351: stdout chunk (state=3): >>><<< 10896 1726882177.29498: done transferring module to remote 10896 1726882177.29501: _low_level_execute_command(): starting 10896 1726882177.29504: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882176.9825418-11842-264433652054512/ /root/.ansible/tmp/ansible-tmp-1726882176.9825418-11842-264433652054512/AnsiballZ_ping.py && sleep 0' 10896 1726882177.30725: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882177.30729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882177.30732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882177.30750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882177.30791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882177.30876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882177.32613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882177.32653: stderr chunk (state=3): >>><<< 10896 1726882177.32657: stdout chunk (state=3): >>><<< 10896 1726882177.32681: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882177.32686: _low_level_execute_command(): starting 10896 1726882177.32692: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882176.9825418-11842-264433652054512/AnsiballZ_ping.py && sleep 0' 10896 1726882177.33395: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 10896 1726882177.33408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882177.33492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882177.33529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882177.33631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882177.48355: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 10896 1726882177.49454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 10896 1726882177.49470: stderr chunk (state=3): >>><<< 10896 1726882177.49474: stdout chunk (state=3): >>><<< 10896 1726882177.49488: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 10896 1726882177.49517: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882176.9825418-11842-264433652054512/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882177.49525: _low_level_execute_command(): starting 10896 1726882177.49529: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882176.9825418-11842-264433652054512/ > /dev/null 2>&1 && sleep 0' 10896 1726882177.50073: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882177.50076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882177.50081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882177.50084: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882177.50086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882177.50088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.14.69 originally 10.31.14.69 debug2: match found <<< 10896 1726882177.50090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882177.50150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882177.50156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882177.50219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882177.52014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882177.52059: stderr chunk (state=3): >>><<< 10896 1726882177.52062: stdout chunk (state=3): >>><<< 10896 1726882177.52078: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882177.52084: handler run complete 10896 1726882177.52098: attempt loop complete, returning result 10896 1726882177.52101: _execute() done 10896 1726882177.52104: dumping result to json 10896 1726882177.52107: done dumping result, returning 10896 1726882177.52116: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-8b02-b216-00000000003b] 10896 1726882177.52120: sending task result for task 12673a56-9f93-8b02-b216-00000000003b 10896 1726882177.52223: done sending task result for task 12673a56-9f93-8b02-b216-00000000003b 10896 1726882177.52226: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 10896 1726882177.52284: no more pending results, returning what we have 10896 1726882177.52287: results queue empty 10896 1726882177.52288: checking for any_errors_fatal 10896 1726882177.52297: done checking for any_errors_fatal 10896 1726882177.52298: checking for max_fail_percentage 10896 1726882177.52299: done checking for max_fail_percentage 10896 1726882177.52300: checking to see if all hosts have failed and the running result is not ok 10896 1726882177.52301: done checking to see if all hosts have failed 10896 1726882177.52302: getting the remaining hosts for this loop 10896 1726882177.52303: done getting the remaining hosts for this loop 10896 1726882177.52306: getting the next task for host managed_node2 10896 1726882177.52315: done getting next task for host managed_node2 10896 
1726882177.52317: ^ task is: TASK: meta (role_complete) 10896 1726882177.52319: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882177.52334: getting variables 10896 1726882177.52337: in VariableManager get_vars() 10896 1726882177.52382: Calling all_inventory to load vars for managed_node2 10896 1726882177.52385: Calling groups_inventory to load vars for managed_node2 10896 1726882177.52387: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882177.52408: Calling all_plugins_play to load vars for managed_node2 10896 1726882177.52412: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882177.52416: Calling groups_plugins_play to load vars for managed_node2 10896 1726882177.53482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882177.54341: done with get_vars() 10896 1726882177.54360: done getting variables 10896 1726882177.54424: done queuing things up, now waiting for results queue to drain 10896 1726882177.54426: results queue empty 10896 1726882177.54426: checking for any_errors_fatal 10896 1726882177.54428: done checking for any_errors_fatal 10896 1726882177.54429: checking for max_fail_percentage 10896 1726882177.54429: done checking for max_fail_percentage 10896 1726882177.54430: checking to see if all hosts have failed and the running result is not ok 10896 1726882177.54430: done checking to see if all hosts have failed 10896 1726882177.54431: getting the remaining hosts for this loop 10896 1726882177.54431: done getting the remaining hosts for this loop 10896 1726882177.54433: getting the next task for host managed_node2 10896 1726882177.54436: done getting next task for host managed_node2 10896 1726882177.54438: ^ task is: TASK: Include the task 'get_interface_stat.yml' 10896 1726882177.54439: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882177.54441: getting variables 10896 1726882177.54441: in VariableManager get_vars() 10896 1726882177.54451: Calling all_inventory to load vars for managed_node2 10896 1726882177.54452: Calling groups_inventory to load vars for managed_node2 10896 1726882177.54454: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882177.54457: Calling all_plugins_play to load vars for managed_node2 10896 1726882177.54459: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882177.54460: Calling groups_plugins_play to load vars for managed_node2 10896 1726882177.55103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882177.55966: done with get_vars() 10896 1726882177.55982: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:29:37 -0400 (0:00:00.628) 0:00:19.127 ****** 10896 1726882177.56040: entering _queue_task() for managed_node2/include_tasks 10896 1726882177.56299: worker is 1 (out of 1 available) 10896 1726882177.56311: exiting _queue_task() for managed_node2/include_tasks 10896 1726882177.56324: done queuing things up, now waiting for results queue to drain 10896 1726882177.56326: waiting for pending results... 10896 1726882177.56490: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 10896 1726882177.56571: in run() - task 12673a56-9f93-8b02-b216-00000000006e 10896 1726882177.56584: variable 'ansible_search_path' from source: unknown 10896 1726882177.56588: variable 'ansible_search_path' from source: unknown 10896 1726882177.56617: calling self._execute() 10896 1726882177.56686: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882177.56691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882177.56701: variable 'omit' from source: magic vars 10896 1726882177.57100: variable 'ansible_distribution_major_version' from source: facts 10896 1726882177.57107: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882177.57116: _execute() done 10896 1726882177.57138: dumping result to json 10896 1726882177.57141: done dumping result, returning 10896 1726882177.57143: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-8b02-b216-00000000006e] 10896 1726882177.57145: sending task result for task 12673a56-9f93-8b02-b216-00000000006e 10896 1726882177.57272: done sending task result for task 12673a56-9f93-8b02-b216-00000000006e 10896 1726882177.57275: WORKER PROCESS EXITING 10896 1726882177.57317: no more pending results, returning what we have 10896 1726882177.57322: in VariableManager get_vars() 10896 1726882177.57368: Calling all_inventory to load vars for managed_node2 10896 1726882177.57374: Calling groups_inventory to load vars for managed_node2 10896 1726882177.57376: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882177.57389: Calling all_plugins_play to load vars for managed_node2 10896 1726882177.57392: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882177.57398: Calling groups_plugins_play to load vars for managed_node2 10896 1726882177.58628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 10896 1726882177.59838: done with get_vars() 10896 1726882177.59860: variable 'ansible_search_path' from source: unknown 10896 1726882177.59862: variable 'ansible_search_path' from source: unknown 10896 1726882177.59895: we have included files to process 10896 1726882177.59896: generating all_blocks data 10896 1726882177.59898: done generating all_blocks data 10896 1726882177.59903: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10896 1726882177.59904: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10896 1726882177.59905: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10896 1726882177.60047: done processing included file 10896 1726882177.60048: iterating over new_blocks loaded from include file 10896 1726882177.60049: in VariableManager get_vars() 10896 1726882177.60061: done with get_vars() 10896 1726882177.60062: filtering new block on tags 10896 1726882177.60072: done filtering new block on tags 10896 1726882177.60074: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 10896 1726882177.60077: extending task lists for all hosts with included blocks 10896 1726882177.60159: done extending task lists 10896 1726882177.60160: done processing included files 10896 1726882177.60160: results queue empty 10896 1726882177.60161: checking for any_errors_fatal 10896 1726882177.60162: done checking for any_errors_fatal 10896 1726882177.60162: checking for max_fail_percentage 10896 1726882177.60163: done checking for max_fail_percentage 10896 1726882177.60163: checking to see if all hosts have failed and the running result is not ok 10896 1726882177.60164: done checking to see if all hosts have failed 10896 1726882177.60164: getting the remaining hosts for this loop 10896 1726882177.60165: done getting the remaining hosts for this loop 10896 1726882177.60167: getting the next task for host managed_node2 10896 1726882177.60169: done getting next task for host managed_node2 10896 1726882177.60171: ^ task is: TASK: Get stat for interface {{ interface }} 10896 1726882177.60173: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882177.60174: getting variables 10896 1726882177.60175: in VariableManager get_vars() 10896 1726882177.60183: Calling all_inventory to load vars for managed_node2 10896 1726882177.60185: Calling groups_inventory to load vars for managed_node2 10896 1726882177.60186: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882177.60190: Calling all_plugins_play to load vars for managed_node2 10896 1726882177.60191: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882177.60195: Calling groups_plugins_play to load vars for managed_node2 10896 1726882177.61154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882177.62159: done with get_vars() 10896 1726882177.62173: done getting variables 10896 1726882177.62284: variable 'interface' from source: task vars 10896 1726882177.62289: variable 'controller_device' from source: play vars 10896 1726882177.62331: variable 'controller_device' from source: play vars TASK [Get stat for interface deprecated-bond] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:29:37 -0400 (0:00:00.063) 0:00:19.190 ****** 10896 1726882177.62353: entering _queue_task() for managed_node2/stat 10896 1726882177.62582: worker is 1 (out of 1 available) 10896 1726882177.62596: exiting _queue_task() for managed_node2/stat 10896 1726882177.62609: done queuing things up, now waiting for results queue to drain 10896 1726882177.62611: waiting for pending results... 10896 1726882177.62776: running TaskExecutor() for managed_node2/TASK: Get stat for interface deprecated-bond 10896 1726882177.62865: in run() - task 12673a56-9f93-8b02-b216-000000000242 10896 1726882177.62876: variable 'ansible_search_path' from source: unknown 10896 1726882177.62880: variable 'ansible_search_path' from source: unknown 10896 1726882177.62956: calling self._execute() 10896 1726882177.62989: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882177.62999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882177.63005: variable 'omit' from source: magic vars 10896 1726882177.63258: variable 'ansible_distribution_major_version' from source: facts 10896 1726882177.63266: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882177.63273: variable 'omit' from source: magic vars 10896 1726882177.63315: variable 'omit' from source: magic vars 10896 1726882177.63379: variable 'interface' from source: task vars 10896 1726882177.63383: variable 'controller_device' from source: play vars 10896 1726882177.63433: variable 'controller_device' from source: play vars 10896 1726882177.63447: variable 'omit' from source: magic vars 10896 1726882177.63477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882177.63507: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882177.63523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882177.63537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882177.63547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 10896 1726882177.63568: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882177.63571: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882177.63574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882177.63647: Set connection var ansible_connection to ssh 10896 1726882177.63653: Set connection var ansible_timeout to 10 10896 1726882177.63656: Set connection var ansible_shell_type to sh 10896 1726882177.63662: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882177.63667: Set connection var ansible_shell_executable to /bin/sh 10896 1726882177.63672: Set connection var ansible_pipelining to False 10896 1726882177.63690: variable 'ansible_shell_executable' from source: unknown 10896 1726882177.63697: variable 'ansible_connection' from source: unknown 10896 1726882177.63700: variable 'ansible_module_compression' from source: unknown 10896 1726882177.63702: variable 'ansible_shell_type' from source: unknown 10896 1726882177.63705: variable 'ansible_shell_executable' from source: unknown 10896 1726882177.63708: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882177.63710: variable 'ansible_pipelining' from source: unknown 10896 1726882177.63715: variable 'ansible_timeout' from source: unknown 10896 1726882177.63717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882177.63926: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10896 1726882177.63931: variable 'omit' from source: magic vars 10896 1726882177.63934: starting attempt loop 10896 1726882177.63936: running the handler 10896 1726882177.63986: _low_level_execute_command(): starting 10896 1726882177.63990: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882177.64636: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882177.64640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882177.64643: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882177.64645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882177.64699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882177.64704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882177.64727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
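
The task now executing, "Get stat for interface deprecated-bond", comes from the test helper get_interface_stat.yml:3; the log shows only that it runs the stat action with the interface name expanded from controller_device. A minimal sketch of such a helper task, where the path and the registered variable are illustrative assumptions not visible in this log:

    - name: Get stat for interface {{ interface }}
      stat:
        # Assumed path convention for checking that a kernel network device exists;
        # only the stat module and the task name appear in the log above.
        path: "/sys/class/net/{{ interface }}"
      register: interface_stat   # hypothetical name for the result consumed by assert_device_present.yml

The log then repeats the low-level execution pattern already recorded for the ping task above: home-directory discovery with 'echo ~', temporary directory creation, SFTP transfer of the AnsiballZ payload, remote execution, and cleanup.
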
10896 1726882177.64798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882177.66370: stdout chunk (state=3): >>>/root <<< 10896 1726882177.66469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882177.66506: stderr chunk (state=3): >>><<< 10896 1726882177.66521: stdout chunk (state=3): >>><<< 10896 1726882177.66540: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882177.66559: _low_level_execute_command(): starting 10896 1726882177.66564: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882177.6654572-11880-115837168532546 `" && echo ansible-tmp-1726882177.6654572-11880-115837168532546="` echo /root/.ansible/tmp/ansible-tmp-1726882177.6654572-11880-115837168532546 `" ) && sleep 0' 10896 1726882177.67079: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882177.67089: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 10896 1726882177.67095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882177.67165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882177.67228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882177.69077: stdout chunk (state=3): 
>>>ansible-tmp-1726882177.6654572-11880-115837168532546=/root/.ansible/tmp/ansible-tmp-1726882177.6654572-11880-115837168532546 <<< 10896 1726882177.69196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882177.69250: stderr chunk (state=3): >>><<< 10896 1726882177.69253: stdout chunk (state=3): >>><<< 10896 1726882177.69264: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882177.6654572-11880-115837168532546=/root/.ansible/tmp/ansible-tmp-1726882177.6654572-11880-115837168532546 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882177.69325: variable 'ansible_module_compression' from source: unknown 10896 1726882177.69409: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10896 1726882177.69416: variable 'ansible_facts' from source: unknown 10896 1726882177.69503: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882177.6654572-11880-115837168532546/AnsiballZ_stat.py 10896 1726882177.69589: Sending initial data 10896 1726882177.69595: Sent initial data (153 bytes) 10896 1726882177.70127: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882177.70131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882177.70133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882177.70135: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882177.70137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882177.70210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882177.70214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882177.70266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882177.71778: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 10896 1726882177.71782: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882177.71839: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882177.71899: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpah8qif0n /root/.ansible/tmp/ansible-tmp-1726882177.6654572-11880-115837168532546/AnsiballZ_stat.py <<< 10896 1726882177.71901: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882177.6654572-11880-115837168532546/AnsiballZ_stat.py" <<< 10896 1726882177.71961: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpah8qif0n" to remote "/root/.ansible/tmp/ansible-tmp-1726882177.6654572-11880-115837168532546/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882177.6654572-11880-115837168532546/AnsiballZ_stat.py" <<< 10896 1726882177.72760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882177.72772: stderr chunk (state=3): >>><<< 10896 1726882177.72775: stdout chunk (state=3): >>><<< 10896 1726882177.72815: done transferring module to remote 10896 1726882177.72824: _low_level_execute_command(): starting 10896 1726882177.72828: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882177.6654572-11880-115837168532546/ /root/.ansible/tmp/ansible-tmp-1726882177.6654572-11880-115837168532546/AnsiballZ_stat.py && sleep 0' 10896 1726882177.73327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882177.73364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882177.73368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882177.73430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882177.73434: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882177.73519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882177.75215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882177.75237: stderr chunk (state=3): >>><<< 10896 1726882177.75241: stdout chunk (state=3): >>><<< 10896 1726882177.75254: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882177.75258: _low_level_execute_command(): starting 10896 1726882177.75260: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882177.6654572-11880-115837168532546/AnsiballZ_stat.py && sleep 0' 10896 1726882177.75690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882177.75704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882177.75710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10896 1726882177.75712: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882177.75714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882177.75759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/6f323b04b0' <<< 10896 1726882177.75763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882177.75843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882177.90845: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/deprecated-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27454, "dev": 23, "nlink": 1, "atime": 1726882176.578643, "mtime": 1726882176.578643, "ctime": 1726882176.578643, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/deprecated-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10896 1726882177.92066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882177.92069: stdout chunk (state=3): >>><<< 10896 1726882177.92071: stderr chunk (state=3): >>><<< 10896 1726882177.92089: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/deprecated-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27454, "dev": 23, "nlink": 1, "atime": 1726882176.578643, "mtime": 1726882176.578643, "ctime": 1726882176.578643, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/deprecated-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 10896 1726882177.92235: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882177.6654572-11880-115837168532546/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882177.92239: _low_level_execute_command(): starting 10896 1726882177.92246: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882177.6654572-11880-115837168532546/ > /dev/null 2>&1 && sleep 0' 10896 1726882177.93420: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882177.93433: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882177.93449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882177.93468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882177.93487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882177.93506: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882177.93603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882177.93618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882177.93758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882177.95604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882177.95614: stdout chunk (state=3): >>><<< 10896 1726882177.95627: stderr chunk (state=3): >>><<< 10896 1726882177.95655: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882177.95672: handler run complete 10896 1726882177.95731: attempt loop complete, returning result 10896 1726882177.95742: _execute() done 10896 1726882177.95801: dumping result to json 10896 1726882177.95806: done dumping result, returning 10896 1726882177.95808: done running TaskExecutor() for managed_node2/TASK: Get stat for interface deprecated-bond [12673a56-9f93-8b02-b216-000000000242] 10896 1726882177.95811: sending task result for task 12673a56-9f93-8b02-b216-000000000242 ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882176.578643, "block_size": 4096, "blocks": 0, "ctime": 1726882176.578643, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27454, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "mode": "0777", "mtime": 1726882176.578643, "nlink": 1, "path": "/sys/class/net/deprecated-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 10896 1726882177.96163: no more pending results, returning what we have 10896 1726882177.96167: results queue empty 10896 1726882177.96168: checking for any_errors_fatal 10896 1726882177.96170: done checking for any_errors_fatal 10896 1726882177.96170: checking for max_fail_percentage 10896 1726882177.96172: done checking for max_fail_percentage 10896 1726882177.96173: checking to see if all hosts have failed and the running result is not ok 10896 1726882177.96174: done checking to see if all hosts have failed 10896 1726882177.96174: getting the remaining hosts for this loop 10896 1726882177.96176: done getting the remaining hosts for this loop 10896 1726882177.96245: getting the next task for host managed_node2 10896 1726882177.96289: done getting next task for host managed_node2 10896 1726882177.96296: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 10896 1726882177.96302: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882177.96308: getting variables 10896 1726882177.96309: in VariableManager get_vars() 10896 1726882177.96388: Calling all_inventory to load vars for managed_node2 10896 1726882177.96391: Calling groups_inventory to load vars for managed_node2 10896 1726882177.96395: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882177.96407: Calling all_plugins_play to load vars for managed_node2 10896 1726882177.96410: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882177.96414: Calling groups_plugins_play to load vars for managed_node2 10896 1726882177.96934: done sending task result for task 12673a56-9f93-8b02-b216-000000000242 10896 1726882177.97101: WORKER PROCESS EXITING 10896 1726882177.99568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882178.01258: done with get_vars() 10896 1726882178.01292: done getting variables 10896 1726882178.01351: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882178.01459: variable 'interface' from source: task vars 10896 1726882178.01463: variable 'controller_device' from source: play vars 10896 1726882178.01532: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'deprecated-bond'] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:29:38 -0400 (0:00:00.392) 0:00:19.582 ****** 10896 1726882178.01568: entering _queue_task() for managed_node2/assert 10896 1726882178.01932: worker is 1 (out of 1 available) 10896 1726882178.01945: exiting _queue_task() for managed_node2/assert 10896 1726882178.01959: done queuing things up, now waiting for results queue to drain 10896 1726882178.01960: waiting for pending results... 
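The stat invocation above ran with module_args path=/sys/class/net/deprecated-bond and get_attributes, get_checksum, and get_mime all false, with {{ interface }} resolved from controller_device. A minimal reconstruction of such a task, based only on the logged arguments and the fact that the following assert evaluates interface_stat.stat.exists; the register name is assumed and this is not a verbatim copy of get_interface_stat.yml:

- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"   # resolves to /sys/class/net/deprecated-bond here
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat                   # assumed; matches the variable the assert below reads
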
10896 1726882178.02159: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'deprecated-bond' 10896 1726882178.02535: in run() - task 12673a56-9f93-8b02-b216-00000000006f 10896 1726882178.02555: variable 'ansible_search_path' from source: unknown 10896 1726882178.02563: variable 'ansible_search_path' from source: unknown 10896 1726882178.02606: calling self._execute() 10896 1726882178.02701: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882178.03001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882178.03009: variable 'omit' from source: magic vars 10896 1726882178.03581: variable 'ansible_distribution_major_version' from source: facts 10896 1726882178.03604: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882178.03641: variable 'omit' from source: magic vars 10896 1726882178.03706: variable 'omit' from source: magic vars 10896 1726882178.03816: variable 'interface' from source: task vars 10896 1726882178.03827: variable 'controller_device' from source: play vars 10896 1726882178.03898: variable 'controller_device' from source: play vars 10896 1726882178.03924: variable 'omit' from source: magic vars 10896 1726882178.03968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882178.04010: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882178.04035: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882178.04059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882178.04074: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882178.04112: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882178.04120: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882178.04128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882178.04232: Set connection var ansible_connection to ssh 10896 1726882178.04245: Set connection var ansible_timeout to 10 10896 1726882178.04253: Set connection var ansible_shell_type to sh 10896 1726882178.04266: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882178.04276: Set connection var ansible_shell_executable to /bin/sh 10896 1726882178.04288: Set connection var ansible_pipelining to False 10896 1726882178.04320: variable 'ansible_shell_executable' from source: unknown 10896 1726882178.04327: variable 'ansible_connection' from source: unknown 10896 1726882178.04332: variable 'ansible_module_compression' from source: unknown 10896 1726882178.04337: variable 'ansible_shell_type' from source: unknown 10896 1726882178.04342: variable 'ansible_shell_executable' from source: unknown 10896 1726882178.04346: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882178.04399: variable 'ansible_pipelining' from source: unknown 10896 1726882178.04402: variable 'ansible_timeout' from source: unknown 10896 1726882178.04404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882178.04504: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882178.04568: variable 'omit' from source: magic vars 10896 1726882178.04571: starting attempt loop 10896 1726882178.04574: running the handler 10896 1726882178.04801: variable 'interface_stat' from source: set_fact 10896 1726882178.04805: Evaluated conditional (interface_stat.stat.exists): True 10896 1726882178.04807: handler run complete 10896 1726882178.04810: attempt loop complete, returning result 10896 1726882178.04812: _execute() done 10896 1726882178.04814: dumping result to json 10896 1726882178.04817: done dumping result, returning 10896 1726882178.04819: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'deprecated-bond' [12673a56-9f93-8b02-b216-00000000006f] 10896 1726882178.04822: sending task result for task 12673a56-9f93-8b02-b216-00000000006f 10896 1726882178.04889: done sending task result for task 12673a56-9f93-8b02-b216-00000000006f 10896 1726882178.04892: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 10896 1726882178.04966: no more pending results, returning what we have 10896 1726882178.04970: results queue empty 10896 1726882178.04971: checking for any_errors_fatal 10896 1726882178.04982: done checking for any_errors_fatal 10896 1726882178.04983: checking for max_fail_percentage 10896 1726882178.04985: done checking for max_fail_percentage 10896 1726882178.04986: checking to see if all hosts have failed and the running result is not ok 10896 1726882178.04987: done checking to see if all hosts have failed 10896 1726882178.04988: getting the remaining hosts for this loop 10896 1726882178.04990: done getting the remaining hosts for this loop 10896 1726882178.04996: getting the next task for host managed_node2 10896 1726882178.05007: done getting next task for host managed_node2 10896 1726882178.05013: ^ task is: TASK: Include the task 'assert_profile_present.yml' 10896 1726882178.05015: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882178.05022: getting variables 10896 1726882178.05024: in VariableManager get_vars() 10896 1726882178.05076: Calling all_inventory to load vars for managed_node2 10896 1726882178.05079: Calling groups_inventory to load vars for managed_node2 10896 1726882178.05082: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882178.05648: Calling all_plugins_play to load vars for managed_node2 10896 1726882178.05653: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882178.05657: Calling groups_plugins_play to load vars for managed_node2 10896 1726882178.07223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882178.08840: done with get_vars() 10896 1726882178.08856: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:67 Friday 20 September 2024 21:29:38 -0400 (0:00:00.073) 0:00:19.656 ****** 10896 1726882178.08944: entering _queue_task() for managed_node2/include_tasks 10896 1726882178.09198: worker is 1 (out of 1 available) 10896 1726882178.09213: exiting _queue_task() for managed_node2/include_tasks 10896 1726882178.09225: done queuing things up, now waiting for results queue to drain 10896 1726882178.09226: waiting for pending results... 10896 1726882178.09515: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_present.yml' 10896 1726882178.09543: in run() - task 12673a56-9f93-8b02-b216-000000000070 10896 1726882178.09563: variable 'ansible_search_path' from source: unknown 10896 1726882178.09616: variable 'controller_profile' from source: play vars 10896 1726882178.09815: variable 'controller_profile' from source: play vars 10896 1726882178.09840: variable 'port1_profile' from source: play vars 10896 1726882178.09909: variable 'port1_profile' from source: play vars 10896 1726882178.09944: variable 'port2_profile' from source: play vars 10896 1726882178.10013: variable 'port2_profile' from source: play vars 10896 1726882178.10034: variable 'omit' from source: magic vars 10896 1726882178.10174: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882178.10188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882178.10206: variable 'omit' from source: magic vars 10896 1726882178.10478: variable 'ansible_distribution_major_version' from source: facts 10896 1726882178.10481: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882178.10489: variable 'item' from source: unknown 10896 1726882178.10554: variable 'item' from source: unknown 10896 1726882178.10675: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882178.10687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882178.10704: variable 'omit' from source: magic vars 10896 1726882178.10901: variable 'ansible_distribution_major_version' from source: facts 10896 1726882178.10904: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882178.10907: variable 'item' from source: unknown 10896 1726882178.10984: variable 'item' from source: unknown 10896 1726882178.11040: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882178.11043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 
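The include task queued here loops over the three profile variables resolved from play vars (controller_profile, port1_profile, port2_profile, which expand to bond0, bond0.0, and bond0.1 below), and the included file later reads 'profile' from include params. A hedged sketch of what the task at tests_bond_deprecated.yml:67 likely looks like; the vars/loop structure is inferred from the log, not copied from the playbook:

- name: Include the task 'assert_profile_present.yml'
  include_tasks: tasks/assert_profile_present.yml
  vars:
    profile: "{{ item }}"            # inferred: get_profile_stat.yml reads 'profile' from include params
  loop:
    - "{{ controller_profile }}"     # bond0
    - "{{ port1_profile }}"          # bond0.0
    - "{{ port2_profile }}"          # bond0.1
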
10896 1726882178.11046: variable 'omit' from source: magic vars 10896 1726882178.11212: variable 'ansible_distribution_major_version' from source: facts 10896 1726882178.11215: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882178.11218: variable 'item' from source: unknown 10896 1726882178.11322: variable 'item' from source: unknown 10896 1726882178.11366: dumping result to json 10896 1726882178.11369: done dumping result, returning 10896 1726882178.11371: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_present.yml' [12673a56-9f93-8b02-b216-000000000070] 10896 1726882178.11372: sending task result for task 12673a56-9f93-8b02-b216-000000000070 10896 1726882178.11406: done sending task result for task 12673a56-9f93-8b02-b216-000000000070 10896 1726882178.11409: WORKER PROCESS EXITING 10896 1726882178.11449: no more pending results, returning what we have 10896 1726882178.11453: in VariableManager get_vars() 10896 1726882178.11499: Calling all_inventory to load vars for managed_node2 10896 1726882178.11502: Calling groups_inventory to load vars for managed_node2 10896 1726882178.11504: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882178.11514: Calling all_plugins_play to load vars for managed_node2 10896 1726882178.11517: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882178.11519: Calling groups_plugins_play to load vars for managed_node2 10896 1726882178.12716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882178.14150: done with get_vars() 10896 1726882178.14171: variable 'ansible_search_path' from source: unknown 10896 1726882178.14188: variable 'ansible_search_path' from source: unknown 10896 1726882178.14203: variable 'ansible_search_path' from source: unknown 10896 1726882178.14211: we have included files to process 10896 1726882178.14212: generating all_blocks data 10896 1726882178.14214: done generating all_blocks data 10896 1726882178.14218: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10896 1726882178.14220: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10896 1726882178.14222: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10896 1726882178.14429: in VariableManager get_vars() 10896 1726882178.14451: done with get_vars() 10896 1726882178.14726: done processing included file 10896 1726882178.14728: iterating over new_blocks loaded from include file 10896 1726882178.14730: in VariableManager get_vars() 10896 1726882178.14766: done with get_vars() 10896 1726882178.14768: filtering new block on tags 10896 1726882178.14789: done filtering new block on tags 10896 1726882178.14792: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=bond0) 10896 1726882178.14802: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10896 1726882178.14803: loading included file: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10896 1726882178.14806: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10896 1726882178.14979: in VariableManager get_vars() 10896 1726882178.15103: done with get_vars() 10896 1726882178.15401: done processing included file 10896 1726882178.15403: iterating over new_blocks loaded from include file 10896 1726882178.15404: in VariableManager get_vars() 10896 1726882178.15421: done with get_vars() 10896 1726882178.15423: filtering new block on tags 10896 1726882178.15441: done filtering new block on tags 10896 1726882178.15443: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=bond0.0) 10896 1726882178.15447: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10896 1726882178.15448: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10896 1726882178.15450: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10896 1726882178.15550: in VariableManager get_vars() 10896 1726882178.15627: done with get_vars() 10896 1726882178.15868: done processing included file 10896 1726882178.15870: iterating over new_blocks loaded from include file 10896 1726882178.15871: in VariableManager get_vars() 10896 1726882178.15887: done with get_vars() 10896 1726882178.15889: filtering new block on tags 10896 1726882178.15910: done filtering new block on tags 10896 1726882178.15913: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=bond0.1) 10896 1726882178.15916: extending task lists for all hosts with included blocks 10896 1726882178.20585: done extending task lists 10896 1726882178.20598: done processing included files 10896 1726882178.20599: results queue empty 10896 1726882178.20600: checking for any_errors_fatal 10896 1726882178.20604: done checking for any_errors_fatal 10896 1726882178.20605: checking for max_fail_percentage 10896 1726882178.20606: done checking for max_fail_percentage 10896 1726882178.20607: checking to see if all hosts have failed and the running result is not ok 10896 1726882178.20608: done checking to see if all hosts have failed 10896 1726882178.20608: getting the remaining hosts for this loop 10896 1726882178.20609: done getting the remaining hosts for this loop 10896 1726882178.20612: getting the next task for host managed_node2 10896 1726882178.20616: done getting next task for host managed_node2 10896 1726882178.20618: ^ task is: TASK: Include the task 'get_profile_stat.yml' 10896 1726882178.20621: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882178.20623: getting variables 10896 1726882178.20624: in VariableManager get_vars() 10896 1726882178.20643: Calling all_inventory to load vars for managed_node2 10896 1726882178.20646: Calling groups_inventory to load vars for managed_node2 10896 1726882178.20648: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882178.20654: Calling all_plugins_play to load vars for managed_node2 10896 1726882178.20657: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882178.20660: Calling groups_plugins_play to load vars for managed_node2 10896 1726882178.28644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882178.30113: done with get_vars() 10896 1726882178.30134: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:29:38 -0400 (0:00:00.212) 0:00:19.869 ****** 10896 1726882178.30212: entering _queue_task() for managed_node2/include_tasks 10896 1726882178.30552: worker is 1 (out of 1 available) 10896 1726882178.30563: exiting _queue_task() for managed_node2/include_tasks 10896 1726882178.30575: done queuing things up, now waiting for results queue to drain 10896 1726882178.30576: waiting for pending results... 10896 1726882178.30982: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 10896 1726882178.30987: in run() - task 12673a56-9f93-8b02-b216-000000000260 10896 1726882178.30991: variable 'ansible_search_path' from source: unknown 10896 1726882178.30995: variable 'ansible_search_path' from source: unknown 10896 1726882178.31077: calling self._execute() 10896 1726882178.31104: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882178.31110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882178.31118: variable 'omit' from source: magic vars 10896 1726882178.31501: variable 'ansible_distribution_major_version' from source: facts 10896 1726882178.31511: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882178.31518: _execute() done 10896 1726882178.31522: dumping result to json 10896 1726882178.31524: done dumping result, returning 10896 1726882178.31531: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-8b02-b216-000000000260] 10896 1726882178.31536: sending task result for task 12673a56-9f93-8b02-b216-000000000260 10896 1726882178.31681: done sending task result for task 12673a56-9f93-8b02-b216-000000000260 10896 1726882178.31684: WORKER PROCESS EXITING 10896 1726882178.31717: no more pending results, returning what we have 10896 1726882178.31723: in VariableManager get_vars() 10896 1726882178.31770: Calling all_inventory to load vars for managed_node2 10896 1726882178.31774: Calling groups_inventory to load vars for managed_node2 10896 1726882178.31777: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882178.31791: Calling all_plugins_play to load vars for managed_node2 10896 1726882178.31799: Calling groups_plugins_inventory to load vars for 
managed_node2 10896 1726882178.31803: Calling groups_plugins_play to load vars for managed_node2 10896 1726882178.33172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882178.34653: done with get_vars() 10896 1726882178.34670: variable 'ansible_search_path' from source: unknown 10896 1726882178.34671: variable 'ansible_search_path' from source: unknown 10896 1726882178.34700: we have included files to process 10896 1726882178.34701: generating all_blocks data 10896 1726882178.34702: done generating all_blocks data 10896 1726882178.34703: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10896 1726882178.34704: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10896 1726882178.34706: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10896 1726882178.35378: done processing included file 10896 1726882178.35379: iterating over new_blocks loaded from include file 10896 1726882178.35380: in VariableManager get_vars() 10896 1726882178.35396: done with get_vars() 10896 1726882178.35398: filtering new block on tags 10896 1726882178.35411: done filtering new block on tags 10896 1726882178.35413: in VariableManager get_vars() 10896 1726882178.35427: done with get_vars() 10896 1726882178.35428: filtering new block on tags 10896 1726882178.35441: done filtering new block on tags 10896 1726882178.35442: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 10896 1726882178.35445: extending task lists for all hosts with included blocks 10896 1726882178.35584: done extending task lists 10896 1726882178.35585: done processing included files 10896 1726882178.35586: results queue empty 10896 1726882178.35586: checking for any_errors_fatal 10896 1726882178.35589: done checking for any_errors_fatal 10896 1726882178.35589: checking for max_fail_percentage 10896 1726882178.35590: done checking for max_fail_percentage 10896 1726882178.35590: checking to see if all hosts have failed and the running result is not ok 10896 1726882178.35591: done checking to see if all hosts have failed 10896 1726882178.35591: getting the remaining hosts for this loop 10896 1726882178.35592: done getting the remaining hosts for this loop 10896 1726882178.35597: getting the next task for host managed_node2 10896 1726882178.35600: done getting next task for host managed_node2 10896 1726882178.35601: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 10896 1726882178.35603: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882178.35604: getting variables 10896 1726882178.35605: in VariableManager get_vars() 10896 1726882178.35613: Calling all_inventory to load vars for managed_node2 10896 1726882178.35615: Calling groups_inventory to load vars for managed_node2 10896 1726882178.35616: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882178.35620: Calling all_plugins_play to load vars for managed_node2 10896 1726882178.35622: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882178.35623: Calling groups_plugins_play to load vars for managed_node2 10896 1726882178.36226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882178.37330: done with get_vars() 10896 1726882178.37349: done getting variables 10896 1726882178.37464: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:29:38 -0400 (0:00:00.072) 0:00:19.941 ****** 10896 1726882178.37488: entering _queue_task() for managed_node2/set_fact 10896 1726882178.37792: worker is 1 (out of 1 available) 10896 1726882178.37805: exiting _queue_task() for managed_node2/set_fact 10896 1726882178.37819: done queuing things up, now waiting for results queue to drain 10896 1726882178.37820: waiting for pending results... 
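The set_fact task queued here (get_profile_stat.yml:3) initializes the three lsr_net_profile_* flags that the task result below reports as false. A minimal sketch of the task, reconstructed from the logged ansible_facts rather than copied from the file:

- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
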
10896 1726882178.38075: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 10896 1726882178.38310: in run() - task 12673a56-9f93-8b02-b216-0000000003b3 10896 1726882178.38314: variable 'ansible_search_path' from source: unknown 10896 1726882178.38316: variable 'ansible_search_path' from source: unknown 10896 1726882178.38319: calling self._execute() 10896 1726882178.38374: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882178.38386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882178.38421: variable 'omit' from source: magic vars 10896 1726882178.38896: variable 'ansible_distribution_major_version' from source: facts 10896 1726882178.38908: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882178.38913: variable 'omit' from source: magic vars 10896 1726882178.38941: variable 'omit' from source: magic vars 10896 1726882178.38967: variable 'omit' from source: magic vars 10896 1726882178.38999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882178.39026: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882178.39046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882178.39061: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882178.39070: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882178.39095: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882178.39100: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882178.39104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882178.39178: Set connection var ansible_connection to ssh 10896 1726882178.39181: Set connection var ansible_timeout to 10 10896 1726882178.39184: Set connection var ansible_shell_type to sh 10896 1726882178.39197: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882178.39201: Set connection var ansible_shell_executable to /bin/sh 10896 1726882178.39203: Set connection var ansible_pipelining to False 10896 1726882178.39222: variable 'ansible_shell_executable' from source: unknown 10896 1726882178.39225: variable 'ansible_connection' from source: unknown 10896 1726882178.39228: variable 'ansible_module_compression' from source: unknown 10896 1726882178.39230: variable 'ansible_shell_type' from source: unknown 10896 1726882178.39233: variable 'ansible_shell_executable' from source: unknown 10896 1726882178.39235: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882178.39237: variable 'ansible_pipelining' from source: unknown 10896 1726882178.39240: variable 'ansible_timeout' from source: unknown 10896 1726882178.39242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882178.39341: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882178.39349: variable 
'omit' from source: magic vars 10896 1726882178.39354: starting attempt loop 10896 1726882178.39357: running the handler 10896 1726882178.39410: handler run complete 10896 1726882178.39414: attempt loop complete, returning result 10896 1726882178.39416: _execute() done 10896 1726882178.39419: dumping result to json 10896 1726882178.39421: done dumping result, returning 10896 1726882178.39424: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-8b02-b216-0000000003b3] 10896 1726882178.39426: sending task result for task 12673a56-9f93-8b02-b216-0000000003b3 10896 1726882178.39481: done sending task result for task 12673a56-9f93-8b02-b216-0000000003b3 10896 1726882178.39483: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 10896 1726882178.39535: no more pending results, returning what we have 10896 1726882178.39538: results queue empty 10896 1726882178.39539: checking for any_errors_fatal 10896 1726882178.39540: done checking for any_errors_fatal 10896 1726882178.39541: checking for max_fail_percentage 10896 1726882178.39543: done checking for max_fail_percentage 10896 1726882178.39544: checking to see if all hosts have failed and the running result is not ok 10896 1726882178.39544: done checking to see if all hosts have failed 10896 1726882178.39545: getting the remaining hosts for this loop 10896 1726882178.39546: done getting the remaining hosts for this loop 10896 1726882178.39549: getting the next task for host managed_node2 10896 1726882178.39555: done getting next task for host managed_node2 10896 1726882178.39558: ^ task is: TASK: Stat profile file 10896 1726882178.39562: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882178.39565: getting variables 10896 1726882178.39567: in VariableManager get_vars() 10896 1726882178.39606: Calling all_inventory to load vars for managed_node2 10896 1726882178.39609: Calling groups_inventory to load vars for managed_node2 10896 1726882178.39611: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882178.39619: Calling all_plugins_play to load vars for managed_node2 10896 1726882178.39622: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882178.39624: Calling groups_plugins_play to load vars for managed_node2 10896 1726882178.40448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882178.41337: done with get_vars() 10896 1726882178.41355: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:29:38 -0400 (0:00:00.039) 0:00:19.981 ****** 10896 1726882178.41441: entering _queue_task() for managed_node2/stat 10896 1726882178.41713: worker is 1 (out of 1 available) 10896 1726882178.41724: exiting _queue_task() for managed_node2/stat 10896 1726882178.41735: done queuing things up, now waiting for results queue to drain 10896 1726882178.41736: waiting for pending results... 10896 1726882178.42138: running TaskExecutor() for managed_node2/TASK: Stat profile file 10896 1726882178.42143: in run() - task 12673a56-9f93-8b02-b216-0000000003b4 10896 1726882178.42147: variable 'ansible_search_path' from source: unknown 10896 1726882178.42149: variable 'ansible_search_path' from source: unknown 10896 1726882178.42153: calling self._execute() 10896 1726882178.42287: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882178.42291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882178.42296: variable 'omit' from source: magic vars 10896 1726882178.42566: variable 'ansible_distribution_major_version' from source: facts 10896 1726882178.42575: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882178.42581: variable 'omit' from source: magic vars 10896 1726882178.42613: variable 'omit' from source: magic vars 10896 1726882178.42683: variable 'profile' from source: include params 10896 1726882178.42687: variable 'item' from source: include params 10896 1726882178.42737: variable 'item' from source: include params 10896 1726882178.42752: variable 'omit' from source: magic vars 10896 1726882178.42786: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882178.42820: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882178.42836: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882178.42849: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882178.42858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882178.42884: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882178.42887: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882178.42890: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882178.42958: Set connection var ansible_connection to ssh 10896 1726882178.42962: Set connection var ansible_timeout to 10 10896 1726882178.42965: Set connection var ansible_shell_type to sh 10896 1726882178.42974: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882178.42976: Set connection var ansible_shell_executable to /bin/sh 10896 1726882178.42990: Set connection var ansible_pipelining to False 10896 1726882178.43006: variable 'ansible_shell_executable' from source: unknown 10896 1726882178.43009: variable 'ansible_connection' from source: unknown 10896 1726882178.43012: variable 'ansible_module_compression' from source: unknown 10896 1726882178.43014: variable 'ansible_shell_type' from source: unknown 10896 1726882178.43017: variable 'ansible_shell_executable' from source: unknown 10896 1726882178.43020: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882178.43024: variable 'ansible_pipelining' from source: unknown 10896 1726882178.43027: variable 'ansible_timeout' from source: unknown 10896 1726882178.43030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882178.43174: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10896 1726882178.43183: variable 'omit' from source: magic vars 10896 1726882178.43188: starting attempt loop 10896 1726882178.43191: running the handler 10896 1726882178.43210: _low_level_execute_command(): starting 10896 1726882178.43218: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882178.43747: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882178.43752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882178.43755: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882178.43758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882178.43796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882178.43810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882178.43886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882178.45584: stdout chunk (state=3): >>>/root <<< 10896 1726882178.45683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882178.45692: stdout chunk 
(state=3): >>><<< 10896 1726882178.45715: stderr chunk (state=3): >>><<< 10896 1726882178.45736: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882178.45751: _low_level_execute_command(): starting 10896 1726882178.45754: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882178.4573615-11918-241682242636833 `" && echo ansible-tmp-1726882178.4573615-11918-241682242636833="` echo /root/.ansible/tmp/ansible-tmp-1726882178.4573615-11918-241682242636833 `" ) && sleep 0' 10896 1726882178.46401: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882178.46405: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10896 1726882178.46415: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882178.46508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882178.46536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882178.48426: stdout chunk (state=3): >>>ansible-tmp-1726882178.4573615-11918-241682242636833=/root/.ansible/tmp/ansible-tmp-1726882178.4573615-11918-241682242636833 <<< 10896 1726882178.48545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882178.48570: stderr chunk (state=3): >>><<< 10896 1726882178.48574: stdout chunk (state=3): >>><<< 
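The shell one-liner whose output appears above is the remote working-directory bootstrap for this task: set a restrictive umask, make sure the base /root/.ansible/tmp exists, create a uniquely named ansible-tmp-1726882178.4573615-11918-241682242636833 directory, and echo name=path back so the controller can reuse it in the follow-up commands. A rough Python sketch of those same steps, with the paths copied from the log (an illustration only, not Ansible's own implementation):

    import os

    def make_remote_tmpdir(base="/root/.ansible/tmp",
                           name="ansible-tmp-1726882178.4573615-11918-241682242636833"):
        old_umask = os.umask(0o077)           # `umask 77`
        try:
            os.makedirs(base, exist_ok=True)  # `mkdir -p /root/.ansible/tmp`
            tmpdir = os.path.join(base, name)
            os.mkdir(tmpdir)                  # fails if the unique dir already exists
        finally:
            os.umask(old_umask)
        return f"{name}={tmpdir}"             # the one-liner echoes name=path back

    print(make_remote_tmpdir())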
10896 1726882178.48589: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882178.4573615-11918-241682242636833=/root/.ansible/tmp/ansible-tmp-1726882178.4573615-11918-241682242636833 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882178.48628: variable 'ansible_module_compression' from source: unknown 10896 1726882178.48682: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10896 1726882178.48712: variable 'ansible_facts' from source: unknown 10896 1726882178.48798: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882178.4573615-11918-241682242636833/AnsiballZ_stat.py 10896 1726882178.48945: Sending initial data 10896 1726882178.48972: Sent initial data (153 bytes) 10896 1726882178.49675: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882178.49678: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882178.49681: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882178.49683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882178.49771: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882178.49827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882178.51333: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 10896 1726882178.51337: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882178.51389: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882178.51453: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpm68du0r8 /root/.ansible/tmp/ansible-tmp-1726882178.4573615-11918-241682242636833/AnsiballZ_stat.py <<< 10896 1726882178.51457: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882178.4573615-11918-241682242636833/AnsiballZ_stat.py" <<< 10896 1726882178.51540: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpm68du0r8" to remote "/root/.ansible/tmp/ansible-tmp-1726882178.4573615-11918-241682242636833/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882178.4573615-11918-241682242636833/AnsiballZ_stat.py" <<< 10896 1726882178.52322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882178.52368: stderr chunk (state=3): >>><<< 10896 1726882178.52371: stdout chunk (state=3): >>><<< 10896 1726882178.52407: done transferring module to remote 10896 1726882178.52417: _low_level_execute_command(): starting 10896 1726882178.52421: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882178.4573615-11918-241682242636833/ /root/.ansible/tmp/ansible-tmp-1726882178.4573615-11918-241682242636833/AnsiballZ_stat.py && sleep 0' 10896 1726882178.52947: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882178.52982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882178.52985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882178.52987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882178.52989: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882178.52991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882178.53063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882178.53116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882178.54833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882178.54873: stderr chunk (state=3): >>><<< 10896 1726882178.54879: stdout chunk (state=3): >>><<< 10896 1726882178.54916: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882178.54919: _low_level_execute_command(): starting 10896 1726882178.54926: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882178.4573615-11918-241682242636833/AnsiballZ_stat.py && sleep 0' 10896 1726882178.55402: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882178.55407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882178.55410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882178.55423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882178.55469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882178.55473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882178.55541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882178.70469: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, 
"invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10896 1726882178.71685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882178.71706: stderr chunk (state=3): >>><<< 10896 1726882178.71709: stdout chunk (state=3): >>><<< 10896 1726882178.71738: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
10896 1726882178.71775: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882178.4573615-11918-241682242636833/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882178.71785: _low_level_execute_command(): starting 10896 1726882178.71788: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882178.4573615-11918-241682242636833/ > /dev/null 2>&1 && sleep 0' 10896 1726882178.72349: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882178.72421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882178.72425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882178.72428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882178.72474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882178.74440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882178.74470: stdout chunk (state=3): >>><<< 10896 1726882178.74473: stderr chunk (state=3): >>><<< 10896 1726882178.74699: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882178.74703: handler run complete 10896 1726882178.74706: attempt loop complete, returning result 10896 1726882178.74708: _execute() done 10896 1726882178.74710: dumping result to json 10896 1726882178.74712: done dumping result, returning 10896 1726882178.74714: done running TaskExecutor() for managed_node2/TASK: Stat profile file [12673a56-9f93-8b02-b216-0000000003b4] 10896 1726882178.74715: sending task result for task 12673a56-9f93-8b02-b216-0000000003b4 10896 1726882178.74790: done sending task result for task 12673a56-9f93-8b02-b216-0000000003b4 10896 1726882178.74795: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 10896 1726882178.74874: no more pending results, returning what we have 10896 1726882178.74879: results queue empty 10896 1726882178.74880: checking for any_errors_fatal 10896 1726882178.74888: done checking for any_errors_fatal 10896 1726882178.74889: checking for max_fail_percentage 10896 1726882178.74891: done checking for max_fail_percentage 10896 1726882178.74892: checking to see if all hosts have failed and the running result is not ok 10896 1726882178.74896: done checking to see if all hosts have failed 10896 1726882178.74897: getting the remaining hosts for this loop 10896 1726882178.74899: done getting the remaining hosts for this loop 10896 1726882178.74903: getting the next task for host managed_node2 10896 1726882178.74909: done getting next task for host managed_node2 10896 1726882178.74912: ^ task is: TASK: Set NM profile exist flag based on the profile files 10896 1726882178.74918: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882178.74922: getting variables 10896 1726882178.74924: in VariableManager get_vars() 10896 1726882178.74975: Calling all_inventory to load vars for managed_node2 10896 1726882178.74978: Calling groups_inventory to load vars for managed_node2 10896 1726882178.74981: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882178.75351: Calling all_plugins_play to load vars for managed_node2 10896 1726882178.75356: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882178.75360: Calling groups_plugins_play to load vars for managed_node2 10896 1726882178.78182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882178.80427: done with get_vars() 10896 1726882178.80451: done getting variables 10896 1726882178.80524: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:29:38 -0400 (0:00:00.391) 0:00:20.372 ****** 10896 1726882178.80555: entering _queue_task() for managed_node2/set_fact 10896 1726882178.81026: worker is 1 (out of 1 available) 10896 1726882178.81037: exiting _queue_task() for managed_node2/set_fact 10896 1726882178.81050: done queuing things up, now waiting for results queue to drain 10896 1726882178.81052: waiting for pending results... 
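The set_fact task being queued here only fires when the profile file was found; because the stat above registered exists: false, the evaluation just below skips it. A minimal Python analogue of that when: profile_stat.stat.exists gate, using the values from this run (illustrative only):

    # Values as registered in this run; the gate mirrors the conditional
    # reported in the skip result that follows.
    profile_stat = {"stat": {"exists": False}}
    lsr_net_profile_exists = False            # initialized earlier in this block

    if profile_stat["stat"]["exists"]:        # when: profile_stat.stat.exists
        lsr_net_profile_exists = True         # the flag update that would run
    else:
        print("Conditional result was False -> task skipped")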
10896 1726882178.81290: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 10896 1726882178.81455: in run() - task 12673a56-9f93-8b02-b216-0000000003b5 10896 1726882178.81460: variable 'ansible_search_path' from source: unknown 10896 1726882178.81464: variable 'ansible_search_path' from source: unknown 10896 1726882178.81466: calling self._execute() 10896 1726882178.81569: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882178.81582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882178.81606: variable 'omit' from source: magic vars 10896 1726882178.82062: variable 'ansible_distribution_major_version' from source: facts 10896 1726882178.82079: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882178.82624: variable 'profile_stat' from source: set_fact 10896 1726882178.82627: Evaluated conditional (profile_stat.stat.exists): False 10896 1726882178.82629: when evaluation is False, skipping this task 10896 1726882178.82631: _execute() done 10896 1726882178.82632: dumping result to json 10896 1726882178.82634: done dumping result, returning 10896 1726882178.82636: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-8b02-b216-0000000003b5] 10896 1726882178.82638: sending task result for task 12673a56-9f93-8b02-b216-0000000003b5 10896 1726882178.82702: done sending task result for task 12673a56-9f93-8b02-b216-0000000003b5 10896 1726882178.82705: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10896 1726882178.82775: no more pending results, returning what we have 10896 1726882178.82780: results queue empty 10896 1726882178.82781: checking for any_errors_fatal 10896 1726882178.82790: done checking for any_errors_fatal 10896 1726882178.82791: checking for max_fail_percentage 10896 1726882178.82795: done checking for max_fail_percentage 10896 1726882178.82796: checking to see if all hosts have failed and the running result is not ok 10896 1726882178.82896: done checking to see if all hosts have failed 10896 1726882178.82898: getting the remaining hosts for this loop 10896 1726882178.82899: done getting the remaining hosts for this loop 10896 1726882178.82903: getting the next task for host managed_node2 10896 1726882178.82910: done getting next task for host managed_node2 10896 1726882178.82912: ^ task is: TASK: Get NM profile info 10896 1726882178.82917: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882178.82922: getting variables 10896 1726882178.82923: in VariableManager get_vars() 10896 1726882178.82968: Calling all_inventory to load vars for managed_node2 10896 1726882178.82971: Calling groups_inventory to load vars for managed_node2 10896 1726882178.82974: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882178.82987: Calling all_plugins_play to load vars for managed_node2 10896 1726882178.82990: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882178.83310: Calling groups_plugins_play to load vars for managed_node2 10896 1726882178.85731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882178.87465: done with get_vars() 10896 1726882178.87498: done getting variables 10896 1726882178.87586: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:29:38 -0400 (0:00:00.070) 0:00:20.443 ****** 10896 1726882178.87623: entering _queue_task() for managed_node2/shell 10896 1726882178.88211: worker is 1 (out of 1 available) 10896 1726882178.88229: exiting _queue_task() for managed_node2/shell 10896 1726882178.88242: done queuing things up, now waiting for results queue to drain 10896 1726882178.88243: waiting for pending results... 10896 1726882178.88814: running TaskExecutor() for managed_node2/TASK: Get NM profile info 10896 1726882178.89206: in run() - task 12673a56-9f93-8b02-b216-0000000003b6 10896 1726882178.89211: variable 'ansible_search_path' from source: unknown 10896 1726882178.89214: variable 'ansible_search_path' from source: unknown 10896 1726882178.89217: calling self._execute() 10896 1726882178.89221: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882178.89224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882178.89227: variable 'omit' from source: magic vars 10896 1726882178.89820: variable 'ansible_distribution_major_version' from source: facts 10896 1726882178.89831: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882178.89837: variable 'omit' from source: magic vars 10896 1726882178.89879: variable 'omit' from source: magic vars 10896 1726882178.89977: variable 'profile' from source: include params 10896 1726882178.89980: variable 'item' from source: include params 10896 1726882178.90045: variable 'item' from source: include params 10896 1726882178.90061: variable 'omit' from source: magic vars 10896 1726882178.90102: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882178.90135: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882178.90152: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882178.90168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882178.90179: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882178.90210: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882178.90213: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882178.90216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882178.90334: Set connection var ansible_connection to ssh 10896 1726882178.90347: Set connection var ansible_timeout to 10 10896 1726882178.90354: Set connection var ansible_shell_type to sh 10896 1726882178.90366: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882178.90374: Set connection var ansible_shell_executable to /bin/sh 10896 1726882178.90384: Set connection var ansible_pipelining to False 10896 1726882178.90428: variable 'ansible_shell_executable' from source: unknown 10896 1726882178.90441: variable 'ansible_connection' from source: unknown 10896 1726882178.90448: variable 'ansible_module_compression' from source: unknown 10896 1726882178.90454: variable 'ansible_shell_type' from source: unknown 10896 1726882178.90461: variable 'ansible_shell_executable' from source: unknown 10896 1726882178.90467: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882178.90474: variable 'ansible_pipelining' from source: unknown 10896 1726882178.90480: variable 'ansible_timeout' from source: unknown 10896 1726882178.90487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882178.90707: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882178.90745: variable 'omit' from source: magic vars 10896 1726882178.90767: starting attempt loop 10896 1726882178.90770: running the handler 10896 1726882178.90840: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882178.90845: _low_level_execute_command(): starting 10896 1726882178.90847: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882178.91663: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882178.91721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
10896 1726882178.91799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882178.91925: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882178.92230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882178.92323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882178.93922: stdout chunk (state=3): >>>/root <<< 10896 1726882178.94074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882178.94077: stdout chunk (state=3): >>><<< 10896 1726882178.94079: stderr chunk (state=3): >>><<< 10896 1726882178.94106: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882178.94203: _low_level_execute_command(): starting 10896 1726882178.94207: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882178.941133-11943-280572066225900 `" && echo ansible-tmp-1726882178.941133-11943-280572066225900="` echo /root/.ansible/tmp/ansible-tmp-1726882178.941133-11943-280572066225900 `" ) && sleep 0' 10896 1726882178.94739: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882178.94765: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882178.94782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882178.94809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882178.94868: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882178.94921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882178.94935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882178.94974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882178.95029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882178.96905: stdout chunk (state=3): >>>ansible-tmp-1726882178.941133-11943-280572066225900=/root/.ansible/tmp/ansible-tmp-1726882178.941133-11943-280572066225900 <<< 10896 1726882178.97052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882178.97055: stdout chunk (state=3): >>><<< 10896 1726882178.97058: stderr chunk (state=3): >>><<< 10896 1726882178.97299: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882178.941133-11943-280572066225900=/root/.ansible/tmp/ansible-tmp-1726882178.941133-11943-280572066225900 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882178.97302: variable 'ansible_module_compression' from source: unknown 10896 1726882178.97304: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10896 1726882178.97307: variable 'ansible_facts' from source: unknown 10896 1726882178.97309: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882178.941133-11943-280572066225900/AnsiballZ_command.py 10896 1726882178.97444: Sending initial data 10896 1726882178.97454: Sent initial data (155 bytes) 10896 1726882178.98105: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882178.98124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882178.98209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882178.99738: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882178.99812: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882178.99874: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmp5umke3j3 /root/.ansible/tmp/ansible-tmp-1726882178.941133-11943-280572066225900/AnsiballZ_command.py <<< 10896 1726882178.99877: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882178.941133-11943-280572066225900/AnsiballZ_command.py" <<< 10896 1726882178.99951: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmp5umke3j3" to remote "/root/.ansible/tmp/ansible-tmp-1726882178.941133-11943-280572066225900/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882178.941133-11943-280572066225900/AnsiballZ_command.py" <<< 10896 1726882179.00816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882179.00828: stderr chunk (state=3): >>><<< 10896 1726882179.00839: stdout chunk (state=3): >>><<< 10896 1726882179.00900: done transferring module to remote 10896 1726882179.00903: _low_level_execute_command(): starting 10896 1726882179.00906: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882178.941133-11943-280572066225900/ /root/.ansible/tmp/ansible-tmp-1726882178.941133-11943-280572066225900/AnsiballZ_command.py && sleep 0' 10896 1726882179.01489: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882179.01560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882179.01564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882179.01566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882179.01569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882179.01571: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882179.01576: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882179.01652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882179.01842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882179.01847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882179.01889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882179.01965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882179.03744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882179.03748: stdout chunk (state=3): >>><<< 10896 1726882179.03750: stderr chunk (state=3): >>><<< 10896 1726882179.03799: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882179.03803: _low_level_execute_command(): starting 10896 1726882179.03806: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882178.941133-11943-280572066225900/AnsiballZ_command.py && sleep 0' 10896 1726882179.04460: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882179.04499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882179.04513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882179.04576: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882179.04623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882179.04642: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882179.04672: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882179.04768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882179.21984: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 21:29:39.194291", "end": "2024-09-20 21:29:39.218579", "delta": "0:00:00.024288", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10896 1726882179.23486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 10896 1726882179.23490: stdout chunk (state=3): >>><<< 10896 1726882179.23497: stderr chunk (state=3): >>><<< 10896 1726882179.23662: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 21:29:39.194291", "end": "2024-09-20 21:29:39.218579", "delta": "0:00:00.024288", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
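The command whose consolidated output appears above is the "Get NM profile info" task from get_profile_stat.yml, executed through ansible.legacy.command with _uses_shell (i.e. the shell module). A minimal sketch of what that task likely looks like, assuming the result is registered as nm_profile_exists (the variable checked a few entries below) and that the literal bond0 in the pipeline comes from a templated {{ profile }}; the ignore_errors and changed_when lines are assumptions inferred from the later rc check and the "changed": false in the displayed result:

    # Hedged reconstruction of the task behind the nmcli output above; the exact
    # YAML in tests/network/playbooks/tasks/get_profile_stat.yml may differ.
    - name: Get NM profile info
      ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
      register: nm_profile_exists   # later evaluated as nm_profile_exists.rc == 0
      ignore_errors: true           # assumption: grep exits non-zero when no profile matches
      changed_when: false           # consistent with "changed": false in the reported result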
10896 1726882179.23667: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882178.941133-11943-280572066225900/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882179.23677: _low_level_execute_command(): starting 10896 1726882179.23679: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882178.941133-11943-280572066225900/ > /dev/null 2>&1 && sleep 0' 10896 1726882179.24645: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882179.24649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882179.24651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882179.24653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882179.24760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882179.24852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882179.24904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882179.26768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882179.26772: stdout chunk (state=3): >>><<< 10896 1726882179.26775: stderr chunk (state=3): >>><<< 10896 1726882179.27001: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882179.27007: handler run complete 10896 1726882179.27011: Evaluated conditional (False): False 10896 1726882179.27014: attempt loop complete, returning result 10896 1726882179.27017: _execute() done 10896 1726882179.27020: dumping result to json 10896 1726882179.27022: done dumping result, returning 10896 1726882179.27025: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [12673a56-9f93-8b02-b216-0000000003b6] 10896 1726882179.27028: sending task result for task 12673a56-9f93-8b02-b216-0000000003b6 10896 1726882179.27110: done sending task result for task 12673a56-9f93-8b02-b216-0000000003b6 10896 1726882179.27114: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.024288", "end": "2024-09-20 21:29:39.218579", "rc": 0, "start": "2024-09-20 21:29:39.194291" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection bond0 /etc/NetworkManager/system-connections/bond0.nmconnection 10896 1726882179.27186: no more pending results, returning what we have 10896 1726882179.27190: results queue empty 10896 1726882179.27191: checking for any_errors_fatal 10896 1726882179.27200: done checking for any_errors_fatal 10896 1726882179.27201: checking for max_fail_percentage 10896 1726882179.27205: done checking for max_fail_percentage 10896 1726882179.27206: checking to see if all hosts have failed and the running result is not ok 10896 1726882179.27207: done checking to see if all hosts have failed 10896 1726882179.27207: getting the remaining hosts for this loop 10896 1726882179.27209: done getting the remaining hosts for this loop 10896 1726882179.27212: getting the next task for host managed_node2 10896 1726882179.27218: done getting next task for host managed_node2 10896 1726882179.27221: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10896 1726882179.27225: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882179.27228: getting variables 10896 1726882179.27230: in VariableManager get_vars() 10896 1726882179.27269: Calling all_inventory to load vars for managed_node2 10896 1726882179.27272: Calling groups_inventory to load vars for managed_node2 10896 1726882179.27274: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882179.27286: Calling all_plugins_play to load vars for managed_node2 10896 1726882179.27289: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882179.27291: Calling groups_plugins_play to load vars for managed_node2 10896 1726882179.29087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882179.31039: done with get_vars() 10896 1726882179.31062: done getting variables 10896 1726882179.31143: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:29:39 -0400 (0:00:00.435) 0:00:20.878 ****** 10896 1726882179.31177: entering _queue_task() for managed_node2/set_fact 10896 1726882179.31534: worker is 1 (out of 1 available) 10896 1726882179.31551: exiting _queue_task() for managed_node2/set_fact 10896 1726882179.31563: done queuing things up, now waiting for results queue to drain 10896 1726882179.31564: waiting for pending results... 
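Before the execution trace of the set_fact task queued above, here is a minimal sketch of what the task at get_profile_stat.yml:35 presumably contains, reconstructed from the conditional evaluated below (nm_profile_exists.rc == 0) and the three facts reported in its result; the real task may set these flags via templates rather than literal true values:

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0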
10896 1726882179.31747: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10896 1726882179.31837: in run() - task 12673a56-9f93-8b02-b216-0000000003b7 10896 1726882179.31876: variable 'ansible_search_path' from source: unknown 10896 1726882179.31879: variable 'ansible_search_path' from source: unknown 10896 1726882179.31917: calling self._execute() 10896 1726882179.32014: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.32020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.32030: variable 'omit' from source: magic vars 10896 1726882179.32452: variable 'ansible_distribution_major_version' from source: facts 10896 1726882179.32464: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882179.32610: variable 'nm_profile_exists' from source: set_fact 10896 1726882179.32807: Evaluated conditional (nm_profile_exists.rc == 0): True 10896 1726882179.32811: variable 'omit' from source: magic vars 10896 1726882179.32814: variable 'omit' from source: magic vars 10896 1726882179.32816: variable 'omit' from source: magic vars 10896 1726882179.32818: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882179.32822: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882179.32825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882179.32838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882179.32849: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882179.32889: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882179.32897: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.32900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.33015: Set connection var ansible_connection to ssh 10896 1726882179.33022: Set connection var ansible_timeout to 10 10896 1726882179.33025: Set connection var ansible_shell_type to sh 10896 1726882179.33033: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882179.33038: Set connection var ansible_shell_executable to /bin/sh 10896 1726882179.33044: Set connection var ansible_pipelining to False 10896 1726882179.33068: variable 'ansible_shell_executable' from source: unknown 10896 1726882179.33072: variable 'ansible_connection' from source: unknown 10896 1726882179.33074: variable 'ansible_module_compression' from source: unknown 10896 1726882179.33077: variable 'ansible_shell_type' from source: unknown 10896 1726882179.33079: variable 'ansible_shell_executable' from source: unknown 10896 1726882179.33082: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.33086: variable 'ansible_pipelining' from source: unknown 10896 1726882179.33106: variable 'ansible_timeout' from source: unknown 10896 1726882179.33109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.33275: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882179.33279: variable 'omit' from source: magic vars 10896 1726882179.33282: starting attempt loop 10896 1726882179.33284: running the handler 10896 1726882179.33288: handler run complete 10896 1726882179.33298: attempt loop complete, returning result 10896 1726882179.33301: _execute() done 10896 1726882179.33303: dumping result to json 10896 1726882179.33317: done dumping result, returning 10896 1726882179.33320: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-8b02-b216-0000000003b7] 10896 1726882179.33323: sending task result for task 12673a56-9f93-8b02-b216-0000000003b7 10896 1726882179.33398: done sending task result for task 12673a56-9f93-8b02-b216-0000000003b7 10896 1726882179.33401: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 10896 1726882179.33468: no more pending results, returning what we have 10896 1726882179.33471: results queue empty 10896 1726882179.33472: checking for any_errors_fatal 10896 1726882179.33487: done checking for any_errors_fatal 10896 1726882179.33487: checking for max_fail_percentage 10896 1726882179.33489: done checking for max_fail_percentage 10896 1726882179.33490: checking to see if all hosts have failed and the running result is not ok 10896 1726882179.33491: done checking to see if all hosts have failed 10896 1726882179.33491: getting the remaining hosts for this loop 10896 1726882179.33495: done getting the remaining hosts for this loop 10896 1726882179.33499: getting the next task for host managed_node2 10896 1726882179.33506: done getting next task for host managed_node2 10896 1726882179.33509: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 10896 1726882179.33512: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882179.33515: getting variables 10896 1726882179.33517: in VariableManager get_vars() 10896 1726882179.33570: Calling all_inventory to load vars for managed_node2 10896 1726882179.33573: Calling groups_inventory to load vars for managed_node2 10896 1726882179.33575: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882179.33584: Calling all_plugins_play to load vars for managed_node2 10896 1726882179.33586: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882179.33588: Calling groups_plugins_play to load vars for managed_node2 10896 1726882179.34375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882179.35247: done with get_vars() 10896 1726882179.35265: done getting variables 10896 1726882179.35310: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882179.35413: variable 'profile' from source: include params 10896 1726882179.35417: variable 'item' from source: include params 10896 1726882179.35470: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:29:39 -0400 (0:00:00.043) 0:00:20.922 ****** 10896 1726882179.35510: entering _queue_task() for managed_node2/command 10896 1726882179.36023: worker is 1 (out of 1 available) 10896 1726882179.36031: exiting _queue_task() for managed_node2/command 10896 1726882179.36041: done queuing things up, now waiting for results queue to drain 10896 1726882179.36042: waiting for pending results... 
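The task queued above and the three that follow it (get/verify the ansible_managed comment, get/verify the fingerprint comment) share the same guard and are all skipped on this host. A sketch of the pattern; only the when: condition and the task name are taken from the log, while the command body and register name are illustrative assumptions:

    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      ansible.builtin.command: grep 'Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # hypothetical command and path
      register: ifcfg_ansible_managed   # hypothetical register name
      when: profile_stat.stat.exists    # evaluates to False here, so the task is skipped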
10896 1726882179.36129: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0 10896 1726882179.36213: in run() - task 12673a56-9f93-8b02-b216-0000000003b9 10896 1726882179.36241: variable 'ansible_search_path' from source: unknown 10896 1726882179.36262: variable 'ansible_search_path' from source: unknown 10896 1726882179.36271: calling self._execute() 10896 1726882179.36360: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.36366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.36385: variable 'omit' from source: magic vars 10896 1726882179.36904: variable 'ansible_distribution_major_version' from source: facts 10896 1726882179.36908: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882179.36910: variable 'profile_stat' from source: set_fact 10896 1726882179.36914: Evaluated conditional (profile_stat.stat.exists): False 10896 1726882179.36921: when evaluation is False, skipping this task 10896 1726882179.36927: _execute() done 10896 1726882179.36933: dumping result to json 10896 1726882179.36939: done dumping result, returning 10896 1726882179.36949: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0 [12673a56-9f93-8b02-b216-0000000003b9] 10896 1726882179.36958: sending task result for task 12673a56-9f93-8b02-b216-0000000003b9 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10896 1726882179.37100: no more pending results, returning what we have 10896 1726882179.37103: results queue empty 10896 1726882179.37105: checking for any_errors_fatal 10896 1726882179.37111: done checking for any_errors_fatal 10896 1726882179.37111: checking for max_fail_percentage 10896 1726882179.37113: done checking for max_fail_percentage 10896 1726882179.37114: checking to see if all hosts have failed and the running result is not ok 10896 1726882179.37115: done checking to see if all hosts have failed 10896 1726882179.37116: getting the remaining hosts for this loop 10896 1726882179.37117: done getting the remaining hosts for this loop 10896 1726882179.37120: getting the next task for host managed_node2 10896 1726882179.37128: done getting next task for host managed_node2 10896 1726882179.37131: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 10896 1726882179.37135: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882179.37140: getting variables 10896 1726882179.37141: in VariableManager get_vars() 10896 1726882179.37183: Calling all_inventory to load vars for managed_node2 10896 1726882179.37185: Calling groups_inventory to load vars for managed_node2 10896 1726882179.37188: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882179.37209: Calling all_plugins_play to load vars for managed_node2 10896 1726882179.37213: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882179.37217: Calling groups_plugins_play to load vars for managed_node2 10896 1726882179.38012: done sending task result for task 12673a56-9f93-8b02-b216-0000000003b9 10896 1726882179.38016: WORKER PROCESS EXITING 10896 1726882179.38349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882179.39202: done with get_vars() 10896 1726882179.39217: done getting variables 10896 1726882179.39261: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882179.39347: variable 'profile' from source: include params 10896 1726882179.39350: variable 'item' from source: include params 10896 1726882179.39389: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:29:39 -0400 (0:00:00.039) 0:00:20.961 ****** 10896 1726882179.39415: entering _queue_task() for managed_node2/set_fact 10896 1726882179.39657: worker is 1 (out of 1 available) 10896 1726882179.39670: exiting _queue_task() for managed_node2/set_fact 10896 1726882179.39683: done queuing things up, now waiting for results queue to drain 10896 1726882179.39684: waiting for pending results... 
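Why these ifcfg checks all skip: profile_stat is evidently a registered stat result for an initscripts-style ifcfg file, and on this host the bond0 profiles are NetworkManager keyfiles under /etc/NetworkManager/system-connections/ (see the nmcli output earlier), so no such file exists. A sketch of the presumed earlier stat task, which lies outside this excerpt; the path shown is a hypothetical example:

    - name: Get the stat of the ifcfg file   # presumed task, not shown in this log excerpt
      ansible.builtin.stat:
        path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # hypothetical path
      register: profile_stat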
10896 1726882179.39872: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0 10896 1726882179.40016: in run() - task 12673a56-9f93-8b02-b216-0000000003ba 10896 1726882179.40021: variable 'ansible_search_path' from source: unknown 10896 1726882179.40025: variable 'ansible_search_path' from source: unknown 10896 1726882179.40028: calling self._execute() 10896 1726882179.40118: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.40155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.40300: variable 'omit' from source: magic vars 10896 1726882179.40556: variable 'ansible_distribution_major_version' from source: facts 10896 1726882179.40573: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882179.40712: variable 'profile_stat' from source: set_fact 10896 1726882179.40736: Evaluated conditional (profile_stat.stat.exists): False 10896 1726882179.40754: when evaluation is False, skipping this task 10896 1726882179.40760: _execute() done 10896 1726882179.40763: dumping result to json 10896 1726882179.40766: done dumping result, returning 10896 1726882179.40768: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0 [12673a56-9f93-8b02-b216-0000000003ba] 10896 1726882179.40797: sending task result for task 12673a56-9f93-8b02-b216-0000000003ba skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10896 1726882179.40914: no more pending results, returning what we have 10896 1726882179.40917: results queue empty 10896 1726882179.40918: checking for any_errors_fatal 10896 1726882179.40925: done checking for any_errors_fatal 10896 1726882179.40926: checking for max_fail_percentage 10896 1726882179.40927: done checking for max_fail_percentage 10896 1726882179.40928: checking to see if all hosts have failed and the running result is not ok 10896 1726882179.40929: done checking to see if all hosts have failed 10896 1726882179.40929: getting the remaining hosts for this loop 10896 1726882179.40931: done getting the remaining hosts for this loop 10896 1726882179.40934: getting the next task for host managed_node2 10896 1726882179.40940: done getting next task for host managed_node2 10896 1726882179.40943: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 10896 1726882179.40947: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882179.40951: getting variables 10896 1726882179.40953: in VariableManager get_vars() 10896 1726882179.41006: Calling all_inventory to load vars for managed_node2 10896 1726882179.41009: Calling groups_inventory to load vars for managed_node2 10896 1726882179.41011: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882179.41021: Calling all_plugins_play to load vars for managed_node2 10896 1726882179.41023: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882179.41026: Calling groups_plugins_play to load vars for managed_node2 10896 1726882179.41607: done sending task result for task 12673a56-9f93-8b02-b216-0000000003ba 10896 1726882179.41612: WORKER PROCESS EXITING 10896 1726882179.41808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882179.42659: done with get_vars() 10896 1726882179.42673: done getting variables 10896 1726882179.42720: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882179.42792: variable 'profile' from source: include params 10896 1726882179.42797: variable 'item' from source: include params 10896 1726882179.42840: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:29:39 -0400 (0:00:00.034) 0:00:20.995 ****** 10896 1726882179.42861: entering _queue_task() for managed_node2/command 10896 1726882179.43076: worker is 1 (out of 1 available) 10896 1726882179.43089: exiting _queue_task() for managed_node2/command 10896 1726882179.43102: done queuing things up, now waiting for results queue to drain 10896 1726882179.43104: waiting for pending results... 
10896 1726882179.43271: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0 10896 1726882179.43359: in run() - task 12673a56-9f93-8b02-b216-0000000003bb 10896 1726882179.43366: variable 'ansible_search_path' from source: unknown 10896 1726882179.43370: variable 'ansible_search_path' from source: unknown 10896 1726882179.43397: calling self._execute() 10896 1726882179.43472: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.43476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.43481: variable 'omit' from source: magic vars 10896 1726882179.43741: variable 'ansible_distribution_major_version' from source: facts 10896 1726882179.43750: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882179.43835: variable 'profile_stat' from source: set_fact 10896 1726882179.43846: Evaluated conditional (profile_stat.stat.exists): False 10896 1726882179.43849: when evaluation is False, skipping this task 10896 1726882179.43852: _execute() done 10896 1726882179.43855: dumping result to json 10896 1726882179.43857: done dumping result, returning 10896 1726882179.43862: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0 [12673a56-9f93-8b02-b216-0000000003bb] 10896 1726882179.43869: sending task result for task 12673a56-9f93-8b02-b216-0000000003bb 10896 1726882179.43944: done sending task result for task 12673a56-9f93-8b02-b216-0000000003bb 10896 1726882179.43947: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10896 1726882179.44020: no more pending results, returning what we have 10896 1726882179.44024: results queue empty 10896 1726882179.44025: checking for any_errors_fatal 10896 1726882179.44030: done checking for any_errors_fatal 10896 1726882179.44031: checking for max_fail_percentage 10896 1726882179.44032: done checking for max_fail_percentage 10896 1726882179.44033: checking to see if all hosts have failed and the running result is not ok 10896 1726882179.44034: done checking to see if all hosts have failed 10896 1726882179.44035: getting the remaining hosts for this loop 10896 1726882179.44036: done getting the remaining hosts for this loop 10896 1726882179.44039: getting the next task for host managed_node2 10896 1726882179.44045: done getting next task for host managed_node2 10896 1726882179.44047: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 10896 1726882179.44050: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882179.44053: getting variables 10896 1726882179.44054: in VariableManager get_vars() 10896 1726882179.44089: Calling all_inventory to load vars for managed_node2 10896 1726882179.44091: Calling groups_inventory to load vars for managed_node2 10896 1726882179.44095: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882179.44104: Calling all_plugins_play to load vars for managed_node2 10896 1726882179.44107: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882179.44109: Calling groups_plugins_play to load vars for managed_node2 10896 1726882179.44953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882179.45816: done with get_vars() 10896 1726882179.45831: done getting variables 10896 1726882179.45873: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882179.45952: variable 'profile' from source: include params 10896 1726882179.45955: variable 'item' from source: include params 10896 1726882179.45998: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:29:39 -0400 (0:00:00.031) 0:00:21.027 ****** 10896 1726882179.46020: entering _queue_task() for managed_node2/set_fact 10896 1726882179.46249: worker is 1 (out of 1 available) 10896 1726882179.46260: exiting _queue_task() for managed_node2/set_fact 10896 1726882179.46274: done queuing things up, now waiting for results queue to drain 10896 1726882179.46276: waiting for pending results... 
10896 1726882179.46447: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0 10896 1726882179.46522: in run() - task 12673a56-9f93-8b02-b216-0000000003bc 10896 1726882179.46534: variable 'ansible_search_path' from source: unknown 10896 1726882179.46538: variable 'ansible_search_path' from source: unknown 10896 1726882179.46565: calling self._execute() 10896 1726882179.46642: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.46646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.46650: variable 'omit' from source: magic vars 10896 1726882179.46907: variable 'ansible_distribution_major_version' from source: facts 10896 1726882179.46916: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882179.46999: variable 'profile_stat' from source: set_fact 10896 1726882179.47010: Evaluated conditional (profile_stat.stat.exists): False 10896 1726882179.47013: when evaluation is False, skipping this task 10896 1726882179.47016: _execute() done 10896 1726882179.47019: dumping result to json 10896 1726882179.47021: done dumping result, returning 10896 1726882179.47028: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0 [12673a56-9f93-8b02-b216-0000000003bc] 10896 1726882179.47032: sending task result for task 12673a56-9f93-8b02-b216-0000000003bc 10896 1726882179.47121: done sending task result for task 12673a56-9f93-8b02-b216-0000000003bc 10896 1726882179.47123: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10896 1726882179.47188: no more pending results, returning what we have 10896 1726882179.47191: results queue empty 10896 1726882179.47191: checking for any_errors_fatal 10896 1726882179.47201: done checking for any_errors_fatal 10896 1726882179.47201: checking for max_fail_percentage 10896 1726882179.47203: done checking for max_fail_percentage 10896 1726882179.47205: checking to see if all hosts have failed and the running result is not ok 10896 1726882179.47206: done checking to see if all hosts have failed 10896 1726882179.47206: getting the remaining hosts for this loop 10896 1726882179.47208: done getting the remaining hosts for this loop 10896 1726882179.47211: getting the next task for host managed_node2 10896 1726882179.47218: done getting next task for host managed_node2 10896 1726882179.47221: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 10896 1726882179.47224: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882179.47228: getting variables 10896 1726882179.47229: in VariableManager get_vars() 10896 1726882179.47261: Calling all_inventory to load vars for managed_node2 10896 1726882179.47263: Calling groups_inventory to load vars for managed_node2 10896 1726882179.47265: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882179.47275: Calling all_plugins_play to load vars for managed_node2 10896 1726882179.47277: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882179.47279: Calling groups_plugins_play to load vars for managed_node2 10896 1726882179.48018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882179.48870: done with get_vars() 10896 1726882179.48885: done getting variables 10896 1726882179.48927: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882179.49006: variable 'profile' from source: include params 10896 1726882179.49009: variable 'item' from source: include params 10896 1726882179.49045: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:29:39 -0400 (0:00:00.030) 0:00:21.057 ****** 10896 1726882179.49069: entering _queue_task() for managed_node2/assert 10896 1726882179.49285: worker is 1 (out of 1 available) 10896 1726882179.49300: exiting _queue_task() for managed_node2/assert 10896 1726882179.49313: done queuing things up, now waiting for results queue to drain 10896 1726882179.49314: waiting for pending results... 
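The rest of this excerpt runs the three assertions in assert_profile_present.yml (task paths :5, :10 and :15 in the log), each checking one of the lsr_net_profile_* facts set earlier. A minimal sketch with the that: lists reconstructed from the conditionals the log evaluates; any fail_msg/success_msg wording in the real file is omitted:

    - name: Assert that the profile is present - '{{ profile }}'
      ansible.builtin.assert:
        that:
          - lsr_net_profile_exists

    - name: Assert that the ansible managed comment is present in '{{ profile }}'
      ansible.builtin.assert:
        that:
          - lsr_net_profile_ansible_managed

    - name: Assert that the fingerprint comment is present in {{ profile }}
      ansible.builtin.assert:
        that:
          - lsr_net_profile_fingerprint

With bond0 present as an active NetworkManager profile, all three conditions evaluate to True and each task reports "All assertions passed".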
10896 1726882179.49479: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0' 10896 1726882179.49549: in run() - task 12673a56-9f93-8b02-b216-000000000261 10896 1726882179.49558: variable 'ansible_search_path' from source: unknown 10896 1726882179.49562: variable 'ansible_search_path' from source: unknown 10896 1726882179.49590: calling self._execute() 10896 1726882179.49664: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.49668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.49677: variable 'omit' from source: magic vars 10896 1726882179.49938: variable 'ansible_distribution_major_version' from source: facts 10896 1726882179.49947: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882179.49952: variable 'omit' from source: magic vars 10896 1726882179.49980: variable 'omit' from source: magic vars 10896 1726882179.50051: variable 'profile' from source: include params 10896 1726882179.50054: variable 'item' from source: include params 10896 1726882179.50103: variable 'item' from source: include params 10896 1726882179.50117: variable 'omit' from source: magic vars 10896 1726882179.50149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882179.50174: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882179.50190: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882179.50209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882179.50218: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882179.50241: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882179.50244: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.50246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.50320: Set connection var ansible_connection to ssh 10896 1726882179.50325: Set connection var ansible_timeout to 10 10896 1726882179.50328: Set connection var ansible_shell_type to sh 10896 1726882179.50334: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882179.50339: Set connection var ansible_shell_executable to /bin/sh 10896 1726882179.50344: Set connection var ansible_pipelining to False 10896 1726882179.50364: variable 'ansible_shell_executable' from source: unknown 10896 1726882179.50367: variable 'ansible_connection' from source: unknown 10896 1726882179.50369: variable 'ansible_module_compression' from source: unknown 10896 1726882179.50371: variable 'ansible_shell_type' from source: unknown 10896 1726882179.50373: variable 'ansible_shell_executable' from source: unknown 10896 1726882179.50375: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.50380: variable 'ansible_pipelining' from source: unknown 10896 1726882179.50382: variable 'ansible_timeout' from source: unknown 10896 1726882179.50386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.50485: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882179.50496: variable 'omit' from source: magic vars 10896 1726882179.50503: starting attempt loop 10896 1726882179.50506: running the handler 10896 1726882179.50580: variable 'lsr_net_profile_exists' from source: set_fact 10896 1726882179.50584: Evaluated conditional (lsr_net_profile_exists): True 10896 1726882179.50590: handler run complete 10896 1726882179.50605: attempt loop complete, returning result 10896 1726882179.50608: _execute() done 10896 1726882179.50611: dumping result to json 10896 1726882179.50613: done dumping result, returning 10896 1726882179.50620: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0' [12673a56-9f93-8b02-b216-000000000261] 10896 1726882179.50624: sending task result for task 12673a56-9f93-8b02-b216-000000000261 10896 1726882179.50699: done sending task result for task 12673a56-9f93-8b02-b216-000000000261 10896 1726882179.50702: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 10896 1726882179.50787: no more pending results, returning what we have 10896 1726882179.50789: results queue empty 10896 1726882179.50790: checking for any_errors_fatal 10896 1726882179.50796: done checking for any_errors_fatal 10896 1726882179.50797: checking for max_fail_percentage 10896 1726882179.50798: done checking for max_fail_percentage 10896 1726882179.50799: checking to see if all hosts have failed and the running result is not ok 10896 1726882179.50800: done checking to see if all hosts have failed 10896 1726882179.50801: getting the remaining hosts for this loop 10896 1726882179.50802: done getting the remaining hosts for this loop 10896 1726882179.50805: getting the next task for host managed_node2 10896 1726882179.50809: done getting next task for host managed_node2 10896 1726882179.50811: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 10896 1726882179.50814: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882179.50817: getting variables 10896 1726882179.50818: in VariableManager get_vars() 10896 1726882179.50852: Calling all_inventory to load vars for managed_node2 10896 1726882179.50855: Calling groups_inventory to load vars for managed_node2 10896 1726882179.50857: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882179.50865: Calling all_plugins_play to load vars for managed_node2 10896 1726882179.50867: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882179.50870: Calling groups_plugins_play to load vars for managed_node2 10896 1726882179.51698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882179.52541: done with get_vars() 10896 1726882179.52554: done getting variables 10896 1726882179.52594: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882179.52669: variable 'profile' from source: include params 10896 1726882179.52671: variable 'item' from source: include params 10896 1726882179.52710: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:29:39 -0400 (0:00:00.036) 0:00:21.094 ****** 10896 1726882179.52735: entering _queue_task() for managed_node2/assert 10896 1726882179.52937: worker is 1 (out of 1 available) 10896 1726882179.52950: exiting _queue_task() for managed_node2/assert 10896 1726882179.52962: done queuing things up, now waiting for results queue to drain 10896 1726882179.52964: waiting for pending results... 
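The recurring "from source: include params" entries show that profile and item reach these tasks as include parameters, i.e. assert_profile_present.yml is included once per profile. A hedged sketch of the likely driver; the including task's name, its file location and the exact loop list are assumptions (this excerpt only shows the bond0 iteration):

    - name: Assert that each expected profile is present   # hypothetical including task
      ansible.builtin.include_tasks: tasks/assert_profile_present.yml
      vars:
        profile: "{{ item }}"   # the include param seen throughout this trace
      loop:
        - bond0      # the iteration shown in this excerpt
        - bond0.0    # assumption, based on the connections reported by nmcli
        - bond0.1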
10896 1726882179.53126: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0' 10896 1726882179.53200: in run() - task 12673a56-9f93-8b02-b216-000000000262 10896 1726882179.53212: variable 'ansible_search_path' from source: unknown 10896 1726882179.53216: variable 'ansible_search_path' from source: unknown 10896 1726882179.53244: calling self._execute() 10896 1726882179.53317: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.53323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.53330: variable 'omit' from source: magic vars 10896 1726882179.53583: variable 'ansible_distribution_major_version' from source: facts 10896 1726882179.53592: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882179.53600: variable 'omit' from source: magic vars 10896 1726882179.53630: variable 'omit' from source: magic vars 10896 1726882179.53698: variable 'profile' from source: include params 10896 1726882179.53704: variable 'item' from source: include params 10896 1726882179.53752: variable 'item' from source: include params 10896 1726882179.53765: variable 'omit' from source: magic vars 10896 1726882179.53797: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882179.53824: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882179.53840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882179.53858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882179.53863: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882179.53885: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882179.53889: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.53891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.53961: Set connection var ansible_connection to ssh 10896 1726882179.53964: Set connection var ansible_timeout to 10 10896 1726882179.53967: Set connection var ansible_shell_type to sh 10896 1726882179.53976: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882179.53980: Set connection var ansible_shell_executable to /bin/sh 10896 1726882179.53985: Set connection var ansible_pipelining to False 10896 1726882179.54007: variable 'ansible_shell_executable' from source: unknown 10896 1726882179.54010: variable 'ansible_connection' from source: unknown 10896 1726882179.54013: variable 'ansible_module_compression' from source: unknown 10896 1726882179.54015: variable 'ansible_shell_type' from source: unknown 10896 1726882179.54017: variable 'ansible_shell_executable' from source: unknown 10896 1726882179.54020: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.54024: variable 'ansible_pipelining' from source: unknown 10896 1726882179.54027: variable 'ansible_timeout' from source: unknown 10896 1726882179.54031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.54129: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882179.54137: variable 'omit' from source: magic vars 10896 1726882179.54142: starting attempt loop 10896 1726882179.54145: running the handler 10896 1726882179.54222: variable 'lsr_net_profile_ansible_managed' from source: set_fact 10896 1726882179.54226: Evaluated conditional (lsr_net_profile_ansible_managed): True 10896 1726882179.54231: handler run complete 10896 1726882179.54241: attempt loop complete, returning result 10896 1726882179.54244: _execute() done 10896 1726882179.54247: dumping result to json 10896 1726882179.54249: done dumping result, returning 10896 1726882179.54255: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0' [12673a56-9f93-8b02-b216-000000000262] 10896 1726882179.54260: sending task result for task 12673a56-9f93-8b02-b216-000000000262 10896 1726882179.54333: done sending task result for task 12673a56-9f93-8b02-b216-000000000262 10896 1726882179.54336: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 10896 1726882179.54378: no more pending results, returning what we have 10896 1726882179.54381: results queue empty 10896 1726882179.54381: checking for any_errors_fatal 10896 1726882179.54386: done checking for any_errors_fatal 10896 1726882179.54387: checking for max_fail_percentage 10896 1726882179.54388: done checking for max_fail_percentage 10896 1726882179.54389: checking to see if all hosts have failed and the running result is not ok 10896 1726882179.54390: done checking to see if all hosts have failed 10896 1726882179.54391: getting the remaining hosts for this loop 10896 1726882179.54392: done getting the remaining hosts for this loop 10896 1726882179.54397: getting the next task for host managed_node2 10896 1726882179.54403: done getting next task for host managed_node2 10896 1726882179.54405: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 10896 1726882179.54408: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882179.54412: getting variables 10896 1726882179.54413: in VariableManager get_vars() 10896 1726882179.54448: Calling all_inventory to load vars for managed_node2 10896 1726882179.54450: Calling groups_inventory to load vars for managed_node2 10896 1726882179.54453: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882179.54461: Calling all_plugins_play to load vars for managed_node2 10896 1726882179.54463: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882179.54466: Calling groups_plugins_play to load vars for managed_node2 10896 1726882179.55210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882179.56147: done with get_vars() 10896 1726882179.56161: done getting variables 10896 1726882179.56202: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882179.56275: variable 'profile' from source: include params 10896 1726882179.56278: variable 'item' from source: include params 10896 1726882179.56318: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:29:39 -0400 (0:00:00.036) 0:00:21.130 ****** 10896 1726882179.56345: entering _queue_task() for managed_node2/assert 10896 1726882179.56549: worker is 1 (out of 1 available) 10896 1726882179.56561: exiting _queue_task() for managed_node2/assert 10896 1726882179.56573: done queuing things up, now waiting for results queue to drain 10896 1726882179.56574: waiting for pending results... 
10896 1726882179.56743: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0 10896 1726882179.56815: in run() - task 12673a56-9f93-8b02-b216-000000000263 10896 1726882179.56823: variable 'ansible_search_path' from source: unknown 10896 1726882179.56826: variable 'ansible_search_path' from source: unknown 10896 1726882179.56853: calling self._execute() 10896 1726882179.56926: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.56930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.56940: variable 'omit' from source: magic vars 10896 1726882179.57188: variable 'ansible_distribution_major_version' from source: facts 10896 1726882179.57199: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882179.57205: variable 'omit' from source: magic vars 10896 1726882179.57232: variable 'omit' from source: magic vars 10896 1726882179.57302: variable 'profile' from source: include params 10896 1726882179.57307: variable 'item' from source: include params 10896 1726882179.57354: variable 'item' from source: include params 10896 1726882179.57368: variable 'omit' from source: magic vars 10896 1726882179.57400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882179.57426: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882179.57441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882179.57458: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882179.57470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882179.57488: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882179.57491: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.57499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.57566: Set connection var ansible_connection to ssh 10896 1726882179.57576: Set connection var ansible_timeout to 10 10896 1726882179.57578: Set connection var ansible_shell_type to sh 10896 1726882179.57581: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882179.57584: Set connection var ansible_shell_executable to /bin/sh 10896 1726882179.57586: Set connection var ansible_pipelining to False 10896 1726882179.57607: variable 'ansible_shell_executable' from source: unknown 10896 1726882179.57611: variable 'ansible_connection' from source: unknown 10896 1726882179.57613: variable 'ansible_module_compression' from source: unknown 10896 1726882179.57616: variable 'ansible_shell_type' from source: unknown 10896 1726882179.57618: variable 'ansible_shell_executable' from source: unknown 10896 1726882179.57622: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.57624: variable 'ansible_pipelining' from source: unknown 10896 1726882179.57627: variable 'ansible_timeout' from source: unknown 10896 1726882179.57629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.57730: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882179.57738: variable 'omit' from source: magic vars 10896 1726882179.57743: starting attempt loop 10896 1726882179.57746: running the handler 10896 1726882179.57821: variable 'lsr_net_profile_fingerprint' from source: set_fact 10896 1726882179.57825: Evaluated conditional (lsr_net_profile_fingerprint): True 10896 1726882179.57830: handler run complete 10896 1726882179.57841: attempt loop complete, returning result 10896 1726882179.57844: _execute() done 10896 1726882179.57847: dumping result to json 10896 1726882179.57850: done dumping result, returning 10896 1726882179.57856: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0 [12673a56-9f93-8b02-b216-000000000263] 10896 1726882179.57861: sending task result for task 12673a56-9f93-8b02-b216-000000000263 10896 1726882179.57942: done sending task result for task 12673a56-9f93-8b02-b216-000000000263 10896 1726882179.57945: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 10896 1726882179.57989: no more pending results, returning what we have 10896 1726882179.57992: results queue empty 10896 1726882179.57996: checking for any_errors_fatal 10896 1726882179.58002: done checking for any_errors_fatal 10896 1726882179.58003: checking for max_fail_percentage 10896 1726882179.58010: done checking for max_fail_percentage 10896 1726882179.58012: checking to see if all hosts have failed and the running result is not ok 10896 1726882179.58013: done checking to see if all hosts have failed 10896 1726882179.58013: getting the remaining hosts for this loop 10896 1726882179.58015: done getting the remaining hosts for this loop 10896 1726882179.58018: getting the next task for host managed_node2 10896 1726882179.58027: done getting next task for host managed_node2 10896 1726882179.58029: ^ task is: TASK: Include the task 'get_profile_stat.yml' 10896 1726882179.58032: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882179.58035: getting variables 10896 1726882179.58036: in VariableManager get_vars() 10896 1726882179.58073: Calling all_inventory to load vars for managed_node2 10896 1726882179.58075: Calling groups_inventory to load vars for managed_node2 10896 1726882179.58078: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882179.58086: Calling all_plugins_play to load vars for managed_node2 10896 1726882179.58088: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882179.58091: Calling groups_plugins_play to load vars for managed_node2 10896 1726882179.58826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882179.59683: done with get_vars() 10896 1726882179.59703: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:29:39 -0400 (0:00:00.034) 0:00:21.164 ****** 10896 1726882179.59770: entering _queue_task() for managed_node2/include_tasks 10896 1726882179.60015: worker is 1 (out of 1 available) 10896 1726882179.60030: exiting _queue_task() for managed_node2/include_tasks 10896 1726882179.60043: done queuing things up, now waiting for results queue to drain 10896 1726882179.60044: waiting for pending results... 10896 1726882179.60214: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 10896 1726882179.60291: in run() - task 12673a56-9f93-8b02-b216-000000000267 10896 1726882179.60303: variable 'ansible_search_path' from source: unknown 10896 1726882179.60307: variable 'ansible_search_path' from source: unknown 10896 1726882179.60333: calling self._execute() 10896 1726882179.60406: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.60412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.60422: variable 'omit' from source: magic vars 10896 1726882179.60680: variable 'ansible_distribution_major_version' from source: facts 10896 1726882179.60689: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882179.60698: _execute() done 10896 1726882179.60701: dumping result to json 10896 1726882179.60705: done dumping result, returning 10896 1726882179.60708: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-8b02-b216-000000000267] 10896 1726882179.60721: sending task result for task 12673a56-9f93-8b02-b216-000000000267 10896 1726882179.60800: done sending task result for task 12673a56-9f93-8b02-b216-000000000267 10896 1726882179.60803: WORKER PROCESS EXITING 10896 1726882179.60846: no more pending results, returning what we have 10896 1726882179.60851: in VariableManager get_vars() 10896 1726882179.60900: Calling all_inventory to load vars for managed_node2 10896 1726882179.60903: Calling groups_inventory to load vars for managed_node2 10896 1726882179.60906: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882179.60916: Calling all_plugins_play to load vars for managed_node2 10896 1726882179.60918: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882179.60920: Calling groups_plugins_play to load vars for managed_node2 10896 1726882179.61791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 10896 1726882179.62630: done with get_vars() 10896 1726882179.62643: variable 'ansible_search_path' from source: unknown 10896 1726882179.62644: variable 'ansible_search_path' from source: unknown 10896 1726882179.62668: we have included files to process 10896 1726882179.62669: generating all_blocks data 10896 1726882179.62670: done generating all_blocks data 10896 1726882179.62673: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10896 1726882179.62674: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10896 1726882179.62675: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10896 1726882179.63263: done processing included file 10896 1726882179.63265: iterating over new_blocks loaded from include file 10896 1726882179.63266: in VariableManager get_vars() 10896 1726882179.63282: done with get_vars() 10896 1726882179.63283: filtering new block on tags 10896 1726882179.63300: done filtering new block on tags 10896 1726882179.63303: in VariableManager get_vars() 10896 1726882179.63314: done with get_vars() 10896 1726882179.63315: filtering new block on tags 10896 1726882179.63327: done filtering new block on tags 10896 1726882179.63329: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 10896 1726882179.63332: extending task lists for all hosts with included blocks 10896 1726882179.63433: done extending task lists 10896 1726882179.63434: done processing included files 10896 1726882179.63435: results queue empty 10896 1726882179.63435: checking for any_errors_fatal 10896 1726882179.63437: done checking for any_errors_fatal 10896 1726882179.63438: checking for max_fail_percentage 10896 1726882179.63438: done checking for max_fail_percentage 10896 1726882179.63439: checking to see if all hosts have failed and the running result is not ok 10896 1726882179.63440: done checking to see if all hosts have failed 10896 1726882179.63440: getting the remaining hosts for this loop 10896 1726882179.63441: done getting the remaining hosts for this loop 10896 1726882179.63442: getting the next task for host managed_node2 10896 1726882179.63444: done getting next task for host managed_node2 10896 1726882179.63446: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 10896 1726882179.63448: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882179.63449: getting variables 10896 1726882179.63450: in VariableManager get_vars() 10896 1726882179.63458: Calling all_inventory to load vars for managed_node2 10896 1726882179.63460: Calling groups_inventory to load vars for managed_node2 10896 1726882179.63461: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882179.63465: Calling all_plugins_play to load vars for managed_node2 10896 1726882179.63466: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882179.63468: Calling groups_plugins_play to load vars for managed_node2 10896 1726882179.64097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882179.64923: done with get_vars() 10896 1726882179.64936: done getting variables 10896 1726882179.64961: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:29:39 -0400 (0:00:00.052) 0:00:21.216 ****** 10896 1726882179.64981: entering _queue_task() for managed_node2/set_fact 10896 1726882179.65220: worker is 1 (out of 1 available) 10896 1726882179.65232: exiting _queue_task() for managed_node2/set_fact 10896 1726882179.65245: done queuing things up, now waiting for results queue to drain 10896 1726882179.65246: waiting for pending results... 
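The include brings get_profile_stat.yml back in for the current profile, and its first task, queued above from get_profile_stat.yml:3, is a plain set_fact. Its result a few entries below lists the three flags it initializes; a sketch, with fact names and values taken from that result and the rest of the YAML assumed:

# Illustrative sketch of the initialization task (get_profile_stat.yml:3);
# fact names and values match the task result reported in the log.
- name: Initialize NM profile exist and ansible_managed comment flag
  ansible.builtin.set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false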
10896 1726882179.65416: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 10896 1726882179.65480: in run() - task 12673a56-9f93-8b02-b216-0000000003fb 10896 1726882179.65491: variable 'ansible_search_path' from source: unknown 10896 1726882179.65496: variable 'ansible_search_path' from source: unknown 10896 1726882179.65523: calling self._execute() 10896 1726882179.65597: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.65607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.65615: variable 'omit' from source: magic vars 10896 1726882179.65897: variable 'ansible_distribution_major_version' from source: facts 10896 1726882179.65912: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882179.65916: variable 'omit' from source: magic vars 10896 1726882179.65947: variable 'omit' from source: magic vars 10896 1726882179.65970: variable 'omit' from source: magic vars 10896 1726882179.66004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882179.66035: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882179.66051: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882179.66064: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882179.66073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882179.66101: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882179.66104: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.66106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.66176: Set connection var ansible_connection to ssh 10896 1726882179.66179: Set connection var ansible_timeout to 10 10896 1726882179.66182: Set connection var ansible_shell_type to sh 10896 1726882179.66189: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882179.66195: Set connection var ansible_shell_executable to /bin/sh 10896 1726882179.66202: Set connection var ansible_pipelining to False 10896 1726882179.66220: variable 'ansible_shell_executable' from source: unknown 10896 1726882179.66223: variable 'ansible_connection' from source: unknown 10896 1726882179.66226: variable 'ansible_module_compression' from source: unknown 10896 1726882179.66230: variable 'ansible_shell_type' from source: unknown 10896 1726882179.66232: variable 'ansible_shell_executable' from source: unknown 10896 1726882179.66235: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.66237: variable 'ansible_pipelining' from source: unknown 10896 1726882179.66239: variable 'ansible_timeout' from source: unknown 10896 1726882179.66243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.66342: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882179.66359: variable 
'omit' from source: magic vars 10896 1726882179.66368: starting attempt loop 10896 1726882179.66371: running the handler 10896 1726882179.66374: handler run complete 10896 1726882179.66379: attempt loop complete, returning result 10896 1726882179.66382: _execute() done 10896 1726882179.66384: dumping result to json 10896 1726882179.66386: done dumping result, returning 10896 1726882179.66395: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-8b02-b216-0000000003fb] 10896 1726882179.66401: sending task result for task 12673a56-9f93-8b02-b216-0000000003fb 10896 1726882179.66473: done sending task result for task 12673a56-9f93-8b02-b216-0000000003fb 10896 1726882179.66476: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 10896 1726882179.66529: no more pending results, returning what we have 10896 1726882179.66532: results queue empty 10896 1726882179.66533: checking for any_errors_fatal 10896 1726882179.66535: done checking for any_errors_fatal 10896 1726882179.66536: checking for max_fail_percentage 10896 1726882179.66537: done checking for max_fail_percentage 10896 1726882179.66538: checking to see if all hosts have failed and the running result is not ok 10896 1726882179.66539: done checking to see if all hosts have failed 10896 1726882179.66539: getting the remaining hosts for this loop 10896 1726882179.66541: done getting the remaining hosts for this loop 10896 1726882179.66543: getting the next task for host managed_node2 10896 1726882179.66550: done getting next task for host managed_node2 10896 1726882179.66552: ^ task is: TASK: Stat profile file 10896 1726882179.66556: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882179.66560: getting variables 10896 1726882179.66561: in VariableManager get_vars() 10896 1726882179.66606: Calling all_inventory to load vars for managed_node2 10896 1726882179.66609: Calling groups_inventory to load vars for managed_node2 10896 1726882179.66611: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882179.66619: Calling all_plugins_play to load vars for managed_node2 10896 1726882179.66621: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882179.66624: Calling groups_plugins_play to load vars for managed_node2 10896 1726882179.67402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882179.68458: done with get_vars() 10896 1726882179.68479: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:29:39 -0400 (0:00:00.035) 0:00:21.252 ****** 10896 1726882179.68571: entering _queue_task() for managed_node2/stat 10896 1726882179.68862: worker is 1 (out of 1 available) 10896 1726882179.68875: exiting _queue_task() for managed_node2/stat 10896 1726882179.68888: done queuing things up, now waiting for results queue to drain 10896 1726882179.68889: waiting for pending results... 10896 1726882179.69297: running TaskExecutor() for managed_node2/TASK: Stat profile file 10896 1726882179.69303: in run() - task 12673a56-9f93-8b02-b216-0000000003fc 10896 1726882179.69321: variable 'ansible_search_path' from source: unknown 10896 1726882179.69326: variable 'ansible_search_path' from source: unknown 10896 1726882179.69402: calling self._execute() 10896 1726882179.69452: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.69457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.69468: variable 'omit' from source: magic vars 10896 1726882179.69815: variable 'ansible_distribution_major_version' from source: facts 10896 1726882179.69829: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882179.69861: variable 'omit' from source: magic vars 10896 1726882179.69875: variable 'omit' from source: magic vars 10896 1726882179.69972: variable 'profile' from source: include params 10896 1726882179.69976: variable 'item' from source: include params 10896 1726882179.70081: variable 'item' from source: include params 10896 1726882179.70085: variable 'omit' from source: magic vars 10896 1726882179.70096: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882179.70132: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882179.70153: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882179.70172: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882179.70189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882179.70215: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882179.70218: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.70221: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.70320: Set connection var ansible_connection to ssh 10896 1726882179.70326: Set connection var ansible_timeout to 10 10896 1726882179.70329: Set connection var ansible_shell_type to sh 10896 1726882179.70336: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882179.70407: Set connection var ansible_shell_executable to /bin/sh 10896 1726882179.70411: Set connection var ansible_pipelining to False 10896 1726882179.70413: variable 'ansible_shell_executable' from source: unknown 10896 1726882179.70416: variable 'ansible_connection' from source: unknown 10896 1726882179.70418: variable 'ansible_module_compression' from source: unknown 10896 1726882179.70421: variable 'ansible_shell_type' from source: unknown 10896 1726882179.70423: variable 'ansible_shell_executable' from source: unknown 10896 1726882179.70425: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882179.70427: variable 'ansible_pipelining' from source: unknown 10896 1726882179.70429: variable 'ansible_timeout' from source: unknown 10896 1726882179.70431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882179.70625: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10896 1726882179.70630: variable 'omit' from source: magic vars 10896 1726882179.70632: starting attempt loop 10896 1726882179.70634: running the handler 10896 1726882179.70636: _low_level_execute_command(): starting 10896 1726882179.70639: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882179.71335: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882179.71410: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882179.71450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882179.71528: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882179.71532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882179.71589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882179.73211: stdout chunk (state=3): >>>/root <<< 10896 1726882179.73372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882179.73375: stdout chunk (state=3): >>><<< 10896 1726882179.73377: stderr chunk 
(state=3): >>><<< 10896 1726882179.73380: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882179.73382: _low_level_execute_command(): starting 10896 1726882179.73385: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882179.7335439-11994-258330969686823 `" && echo ansible-tmp-1726882179.7335439-11994-258330969686823="` echo /root/.ansible/tmp/ansible-tmp-1726882179.7335439-11994-258330969686823 `" ) && sleep 0' 10896 1726882179.73781: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882179.73820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882179.73825: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882179.73836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882179.73840: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882179.73842: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882179.73875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882179.73878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882179.73949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882179.75830: stdout chunk (state=3): >>>ansible-tmp-1726882179.7335439-11994-258330969686823=/root/.ansible/tmp/ansible-tmp-1726882179.7335439-11994-258330969686823 <<< 10896 1726882179.75941: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882179.75962: stderr chunk (state=3): >>><<< 10896 1726882179.75965: stdout chunk (state=3): >>><<< 10896 1726882179.75980: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882179.7335439-11994-258330969686823=/root/.ansible/tmp/ansible-tmp-1726882179.7335439-11994-258330969686823 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882179.76026: variable 'ansible_module_compression' from source: unknown 10896 1726882179.76067: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10896 1726882179.76099: variable 'ansible_facts' from source: unknown 10896 1726882179.76163: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882179.7335439-11994-258330969686823/AnsiballZ_stat.py 10896 1726882179.76265: Sending initial data 10896 1726882179.76268: Sent initial data (153 bytes) 10896 1726882179.76684: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882179.76688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882179.76691: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 10896 1726882179.76695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882179.76697: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882179.76728: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882179.76739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 
1726882179.76817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882179.78360: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882179.78424: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882179.78490: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpwsdn46hh /root/.ansible/tmp/ansible-tmp-1726882179.7335439-11994-258330969686823/AnsiballZ_stat.py <<< 10896 1726882179.78496: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882179.7335439-11994-258330969686823/AnsiballZ_stat.py" <<< 10896 1726882179.78554: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpwsdn46hh" to remote "/root/.ansible/tmp/ansible-tmp-1726882179.7335439-11994-258330969686823/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882179.7335439-11994-258330969686823/AnsiballZ_stat.py" <<< 10896 1726882179.79426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882179.79429: stderr chunk (state=3): >>><<< 10896 1726882179.79431: stdout chunk (state=3): >>><<< 10896 1726882179.79439: done transferring module to remote 10896 1726882179.79464: _low_level_execute_command(): starting 10896 1726882179.79474: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882179.7335439-11994-258330969686823/ /root/.ansible/tmp/ansible-tmp-1726882179.7335439-11994-258330969686823/AnsiballZ_stat.py && sleep 0' 10896 1726882179.80132: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882179.80190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882179.80217: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882179.80243: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882179.80340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882179.82135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882179.82139: stdout chunk (state=3): >>><<< 10896 1726882179.82141: stderr chunk (state=3): >>><<< 10896 1726882179.82159: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882179.82199: _low_level_execute_command(): starting 10896 1726882179.82203: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882179.7335439-11994-258330969686823/AnsiballZ_stat.py && sleep 0' 10896 1726882179.82775: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882179.82789: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882179.82808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882179.82852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882179.82867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882179.82956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882179.82975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882179.82992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882179.83097: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 10896 1726882179.97904: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10896 1726882179.99081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882179.99111: stderr chunk (state=3): >>><<< 10896 1726882179.99114: stdout chunk (state=3): >>><<< 10896 1726882179.99131: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
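The "Stat profile file" task (get_profile_stat.yml:9) checks for an initscripts profile file; its module arguments are visible in the stdout above and in the _execute_module entry that follows. A sketch reconstructed from those arguments; in the collection the filename is almost certainly templated on the profile variable rather than hard-coded, and the register name is inferred from the profile_stat variable used later in the log:

# Illustrative sketch of the "Stat profile file" task (get_profile_stat.yml:9),
# reconstructed from the module_args shown in the log.
- name: Stat profile file
  ansible.builtin.stat:
    path: /etc/sysconfig/network-scripts/ifcfg-bond0.0  # likely "ifcfg-{{ profile }}" in the real task
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat

With get_attributes, get_checksum, and get_mime disabled, the stat call only reports existence, which is all the flag logic that follows needs.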
10896 1726882179.99156: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882179.7335439-11994-258330969686823/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882179.99163: _low_level_execute_command(): starting 10896 1726882179.99168: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882179.7335439-11994-258330969686823/ > /dev/null 2>&1 && sleep 0' 10896 1726882179.99632: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882179.99636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882179.99638: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 10896 1726882179.99640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882179.99642: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882179.99700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882179.99703: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882179.99705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882179.99797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882180.01712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882180.01733: stderr chunk (state=3): >>><<< 10896 1726882180.01736: stdout chunk (state=3): >>><<< 10896 1726882180.01752: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882180.01755: handler run complete 10896 1726882180.01772: attempt loop complete, returning result 10896 1726882180.01775: _execute() done 10896 1726882180.01778: dumping result to json 10896 1726882180.01782: done dumping result, returning 10896 1726882180.01789: done running TaskExecutor() for managed_node2/TASK: Stat profile file [12673a56-9f93-8b02-b216-0000000003fc] 10896 1726882180.01797: sending task result for task 12673a56-9f93-8b02-b216-0000000003fc 10896 1726882180.01884: done sending task result for task 12673a56-9f93-8b02-b216-0000000003fc 10896 1726882180.01887: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 10896 1726882180.01947: no more pending results, returning what we have 10896 1726882180.01950: results queue empty 10896 1726882180.01951: checking for any_errors_fatal 10896 1726882180.01956: done checking for any_errors_fatal 10896 1726882180.01957: checking for max_fail_percentage 10896 1726882180.01959: done checking for max_fail_percentage 10896 1726882180.01960: checking to see if all hosts have failed and the running result is not ok 10896 1726882180.01960: done checking to see if all hosts have failed 10896 1726882180.01961: getting the remaining hosts for this loop 10896 1726882180.01963: done getting the remaining hosts for this loop 10896 1726882180.01966: getting the next task for host managed_node2 10896 1726882180.01973: done getting next task for host managed_node2 10896 1726882180.01975: ^ task is: TASK: Set NM profile exist flag based on the profile files 10896 1726882180.01979: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882180.01983: getting variables 10896 1726882180.01984: in VariableManager get_vars() 10896 1726882180.02029: Calling all_inventory to load vars for managed_node2 10896 1726882180.02032: Calling groups_inventory to load vars for managed_node2 10896 1726882180.02034: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882180.02044: Calling all_plugins_play to load vars for managed_node2 10896 1726882180.02047: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882180.02049: Calling groups_plugins_play to load vars for managed_node2 10896 1726882180.02851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882180.09030: done with get_vars() 10896 1726882180.09051: done getting variables 10896 1726882180.09101: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:29:40 -0400 (0:00:00.405) 0:00:21.658 ****** 10896 1726882180.09128: entering _queue_task() for managed_node2/set_fact 10896 1726882180.09858: worker is 1 (out of 1 available) 10896 1726882180.09871: exiting _queue_task() for managed_node2/set_fact 10896 1726882180.09883: done queuing things up, now waiting for results queue to drain 10896 1726882180.09885: waiting for pending results... 
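The task queued above, "Set NM profile exist flag based on the profile files" (get_profile_stat.yml:17), is guarded by the stat result registered in the previous step; the entries that follow show it being skipped because profile_stat.stat.exists is False. A sketch of such a guarded task; the when expression matches the false_condition reported below, while the set_fact body is an assumption:

# Illustrative sketch of the guarded task (get_profile_stat.yml:17); the body
# shown here is assumed, only the conditional is confirmed by the log.
- name: Set NM profile exist flag based on the profile files
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
  when: profile_stat.stat.exists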
10896 1726882180.10380: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 10896 1726882180.10385: in run() - task 12673a56-9f93-8b02-b216-0000000003fd 10896 1726882180.10409: variable 'ansible_search_path' from source: unknown 10896 1726882180.10413: variable 'ansible_search_path' from source: unknown 10896 1726882180.10449: calling self._execute() 10896 1726882180.10553: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.10556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.10568: variable 'omit' from source: magic vars 10896 1726882180.10961: variable 'ansible_distribution_major_version' from source: facts 10896 1726882180.10987: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882180.11199: variable 'profile_stat' from source: set_fact 10896 1726882180.11211: Evaluated conditional (profile_stat.stat.exists): False 10896 1726882180.11214: when evaluation is False, skipping this task 10896 1726882180.11216: _execute() done 10896 1726882180.11219: dumping result to json 10896 1726882180.11222: done dumping result, returning 10896 1726882180.11229: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-8b02-b216-0000000003fd] 10896 1726882180.11233: sending task result for task 12673a56-9f93-8b02-b216-0000000003fd 10896 1726882180.11327: done sending task result for task 12673a56-9f93-8b02-b216-0000000003fd 10896 1726882180.11331: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10896 1726882180.11414: no more pending results, returning what we have 10896 1726882180.11417: results queue empty 10896 1726882180.11418: checking for any_errors_fatal 10896 1726882180.11426: done checking for any_errors_fatal 10896 1726882180.11426: checking for max_fail_percentage 10896 1726882180.11428: done checking for max_fail_percentage 10896 1726882180.11429: checking to see if all hosts have failed and the running result is not ok 10896 1726882180.11429: done checking to see if all hosts have failed 10896 1726882180.11430: getting the remaining hosts for this loop 10896 1726882180.11432: done getting the remaining hosts for this loop 10896 1726882180.11434: getting the next task for host managed_node2 10896 1726882180.11441: done getting next task for host managed_node2 10896 1726882180.11444: ^ task is: TASK: Get NM profile info 10896 1726882180.11448: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882180.11453: getting variables 10896 1726882180.11454: in VariableManager get_vars() 10896 1726882180.11496: Calling all_inventory to load vars for managed_node2 10896 1726882180.11499: Calling groups_inventory to load vars for managed_node2 10896 1726882180.11501: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882180.11512: Calling all_plugins_play to load vars for managed_node2 10896 1726882180.11515: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882180.11518: Calling groups_plugins_play to load vars for managed_node2 10896 1726882180.13357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882180.15288: done with get_vars() 10896 1726882180.15315: done getting variables 10896 1726882180.15373: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:29:40 -0400 (0:00:00.062) 0:00:21.721 ****** 10896 1726882180.15409: entering _queue_task() for managed_node2/shell 10896 1726882180.15722: worker is 1 (out of 1 available) 10896 1726882180.15733: exiting _queue_task() for managed_node2/shell 10896 1726882180.15745: done queuing things up, now waiting for results queue to drain 10896 1726882180.15747: waiting for pending results... 10896 1726882180.16226: running TaskExecutor() for managed_node2/TASK: Get NM profile info 10896 1726882180.16262: in run() - task 12673a56-9f93-8b02-b216-0000000003fe 10896 1726882180.16267: variable 'ansible_search_path' from source: unknown 10896 1726882180.16271: variable 'ansible_search_path' from source: unknown 10896 1726882180.16324: calling self._execute() 10896 1726882180.16406: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.16412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.16430: variable 'omit' from source: magic vars 10896 1726882180.16867: variable 'ansible_distribution_major_version' from source: facts 10896 1726882180.16871: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882180.16874: variable 'omit' from source: magic vars 10896 1726882180.16876: variable 'omit' from source: magic vars 10896 1726882180.16953: variable 'profile' from source: include params 10896 1726882180.16957: variable 'item' from source: include params 10896 1726882180.17016: variable 'item' from source: include params 10896 1726882180.17033: variable 'omit' from source: magic vars 10896 1726882180.17076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882180.17157: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882180.17199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882180.17302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882180.17306: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882180.17309: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882180.17311: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.17313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.17391: Set connection var ansible_connection to ssh 10896 1726882180.17414: Set connection var ansible_timeout to 10 10896 1726882180.17426: Set connection var ansible_shell_type to sh 10896 1726882180.17441: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882180.17451: Set connection var ansible_shell_executable to /bin/sh 10896 1726882180.17462: Set connection var ansible_pipelining to False 10896 1726882180.17491: variable 'ansible_shell_executable' from source: unknown 10896 1726882180.17517: variable 'ansible_connection' from source: unknown 10896 1726882180.17521: variable 'ansible_module_compression' from source: unknown 10896 1726882180.17523: variable 'ansible_shell_type' from source: unknown 10896 1726882180.17526: variable 'ansible_shell_executable' from source: unknown 10896 1726882180.17625: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.17628: variable 'ansible_pipelining' from source: unknown 10896 1726882180.17631: variable 'ansible_timeout' from source: unknown 10896 1726882180.17633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.17712: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882180.17731: variable 'omit' from source: magic vars 10896 1726882180.17740: starting attempt loop 10896 1726882180.17753: running the handler 10896 1726882180.17769: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882180.17797: _low_level_execute_command(): starting 10896 1726882180.17812: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882180.18566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882180.18592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882180.18687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882180.18725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882180.18741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882180.18755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882180.18862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882180.20619: stdout chunk (state=3): >>>/root <<< 10896 1726882180.20623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882180.20625: stdout chunk (state=3): >>><<< 10896 1726882180.20627: stderr chunk (state=3): >>><<< 10896 1726882180.20717: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882180.20730: _low_level_execute_command(): starting 10896 1726882180.20737: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882180.2071664-12016-3033135191661 `" && echo ansible-tmp-1726882180.2071664-12016-3033135191661="` echo /root/.ansible/tmp/ansible-tmp-1726882180.2071664-12016-3033135191661 `" ) && sleep 0' 10896 1726882180.21883: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882180.22061: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882180.22142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882180.22158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882180.22296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882180.24154: stdout chunk (state=3): >>>ansible-tmp-1726882180.2071664-12016-3033135191661=/root/.ansible/tmp/ansible-tmp-1726882180.2071664-12016-3033135191661 <<< 10896 1726882180.24311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882180.24326: stdout chunk (state=3): >>><<< 10896 1726882180.24344: stderr chunk (state=3): >>><<< 10896 1726882180.24371: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882180.2071664-12016-3033135191661=/root/.ansible/tmp/ansible-tmp-1726882180.2071664-12016-3033135191661 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882180.24413: variable 'ansible_module_compression' from source: unknown 10896 1726882180.24472: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10896 1726882180.24522: variable 'ansible_facts' from source: unknown 10896 1726882180.24899: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882180.2071664-12016-3033135191661/AnsiballZ_command.py 10896 1726882180.24923: Sending initial data 10896 1726882180.24926: Sent initial data (154 bytes) 10896 1726882180.25541: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882180.25562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882180.25614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 
originally 10.31.14.69 debug2: match found <<< 10896 1726882180.25670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882180.25711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882180.25734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882180.25750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882180.25848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882180.27378: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882180.27466: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882180.27545: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmptpbqcx2r /root/.ansible/tmp/ansible-tmp-1726882180.2071664-12016-3033135191661/AnsiballZ_command.py <<< 10896 1726882180.27548: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882180.2071664-12016-3033135191661/AnsiballZ_command.py" <<< 10896 1726882180.27609: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmptpbqcx2r" to remote "/root/.ansible/tmp/ansible-tmp-1726882180.2071664-12016-3033135191661/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882180.2071664-12016-3033135191661/AnsiballZ_command.py" <<< 10896 1726882180.28555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882180.28559: stdout chunk (state=3): >>><<< 10896 1726882180.28561: stderr chunk (state=3): >>><<< 10896 1726882180.28563: done transferring module to remote 10896 1726882180.28565: _low_level_execute_command(): starting 10896 1726882180.28568: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882180.2071664-12016-3033135191661/ /root/.ansible/tmp/ansible-tmp-1726882180.2071664-12016-3033135191661/AnsiballZ_command.py && sleep 0' 10896 1726882180.29186: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882180.29200: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882180.29208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882180.29223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882180.29236: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882180.29242: stderr chunk 
(state=3): >>>debug2: match not found <<< 10896 1726882180.29252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882180.29266: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882180.29301: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 10896 1726882180.29305: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10896 1726882180.29307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882180.29309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882180.29317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882180.29328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882180.29331: stderr chunk (state=3): >>>debug2: match found <<< 10896 1726882180.29341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882180.29409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882180.29421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882180.29438: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882180.29531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882180.31399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882180.31402: stdout chunk (state=3): >>><<< 10896 1726882180.31404: stderr chunk (state=3): >>><<< 10896 1726882180.31407: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882180.31409: _low_level_execute_command(): starting 10896 1726882180.31412: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882180.2071664-12016-3033135191661/AnsiballZ_command.py && sleep 0' 10896 1726882180.32312: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882180.32367: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882180.32574: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882180.32579: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882180.32582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882180.49598: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 21:29:40.474517", "end": "2024-09-20 21:29:40.494702", "delta": "0:00:00.020185", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10896 1726882180.51080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 10896 1726882180.51084: stdout chunk (state=3): >>><<< 10896 1726882180.51099: stderr chunk (state=3): >>><<< 10896 1726882180.51118: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 21:29:40.474517", "end": "2024-09-20 21:29:40.494702", "delta": "0:00:00.020185", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
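The exchange above is Ansible's standard low-level execution cycle for the "Get NM profile info" task at get_profile_stat.yml:25: create a remote temp directory, copy AnsiballZ_command.py over SFTP, chmod it, run it with /usr/bin/python3.12, and (just below) remove the temp directory again. The JSON payload shows the command that actually ran on managed_node2 and its stdout. A minimal sketch of a task that would produce this behaviour is given here; it is not a verbatim copy of the file, only the command string and the nm_profile_exists register name (taken from the conditional evaluated later in the log) are grounded in the output.

    # Hedged sketch, not the exact task from get_profile_stat.yml:25.
    # The rendered command and the register name come from the log; the
    # source file presumably templates the profile name via {{ profile }},
    # and the distro-version condition may be inherited from an enclosing
    # block rather than written on the task itself.
    - name: Get NM profile info
      ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep bond0.0 | grep /etc
      register: nm_profile_exists
      when: ansible_distribution_major_version != '6'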
10896 1726882180.51156: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882180.2071664-12016-3033135191661/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882180.51164: _low_level_execute_command(): starting 10896 1726882180.51178: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882180.2071664-12016-3033135191661/ > /dev/null 2>&1 && sleep 0' 10896 1726882180.51838: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882180.51864: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882180.51879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882180.51910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882180.51923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882180.51930: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882180.51945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882180.51953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882180.51961: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 10896 1726882180.51968: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10896 1726882180.51975: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882180.51985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882180.52001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882180.52009: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882180.52016: stderr chunk (state=3): >>>debug2: match found <<< 10896 1726882180.52026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882180.52144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882180.52147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882180.52203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882180.54041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882180.54044: stdout chunk (state=3): >>><<< 10896 1726882180.54046: stderr chunk (state=3): >>><<< 10896 1726882180.54300: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882180.54304: handler run complete 10896 1726882180.54307: Evaluated conditional (False): False 10896 1726882180.54309: attempt loop complete, returning result 10896 1726882180.54311: _execute() done 10896 1726882180.54314: dumping result to json 10896 1726882180.54316: done dumping result, returning 10896 1726882180.54317: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [12673a56-9f93-8b02-b216-0000000003fe] 10896 1726882180.54320: sending task result for task 12673a56-9f93-8b02-b216-0000000003fe 10896 1726882180.54400: done sending task result for task 12673a56-9f93-8b02-b216-0000000003fe 10896 1726882180.54404: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.020185", "end": "2024-09-20 21:29:40.494702", "rc": 0, "start": "2024-09-20 21:29:40.474517" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 10896 1726882180.54492: no more pending results, returning what we have 10896 1726882180.54499: results queue empty 10896 1726882180.54500: checking for any_errors_fatal 10896 1726882180.54504: done checking for any_errors_fatal 10896 1726882180.54505: checking for max_fail_percentage 10896 1726882180.54507: done checking for max_fail_percentage 10896 1726882180.54508: checking to see if all hosts have failed and the running result is not ok 10896 1726882180.54509: done checking to see if all hosts have failed 10896 1726882180.54510: getting the remaining hosts for this loop 10896 1726882180.54514: done getting the remaining hosts for this loop 10896 1726882180.54518: getting the next task for host managed_node2 10896 1726882180.54525: done getting next task for host managed_node2 10896 1726882180.54528: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10896 1726882180.54531: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882180.54535: getting variables 10896 1726882180.54536: in VariableManager get_vars() 10896 1726882180.54578: Calling all_inventory to load vars for managed_node2 10896 1726882180.54580: Calling groups_inventory to load vars for managed_node2 10896 1726882180.54583: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882180.54618: Calling all_plugins_play to load vars for managed_node2 10896 1726882180.54623: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882180.54626: Calling groups_plugins_play to load vars for managed_node2 10896 1726882180.56453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882180.57572: done with get_vars() 10896 1726882180.57596: done getting variables 10896 1726882180.57658: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:29:40 -0400 (0:00:00.422) 0:00:22.143 ****** 10896 1726882180.57697: entering _queue_task() for managed_node2/set_fact 10896 1726882180.58013: worker is 1 (out of 1 available) 10896 1726882180.58029: exiting _queue_task() for managed_node2/set_fact 10896 1726882180.58041: done queuing things up, now waiting for results queue to drain 10896 1726882180.58043: waiting for pending results... 
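The task just queued ("Set NM profile exist flag and ansible_managed flag true based on the nmcli output", get_profile_stat.yml:35) is a set_fact action. The log that follows shows its gate, nm_profile_exists.rc == 0, evaluating to True and the three facts it returns. A minimal sketch consistent with that output, reconstructed from the log rather than copied from the file, is:

    # Hedged reconstruction from the logged conditional and the returned
    # ansible_facts; the real task may compute these values differently.
    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0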
10896 1726882180.58396: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10896 1726882180.58521: in run() - task 12673a56-9f93-8b02-b216-0000000003ff 10896 1726882180.58526: variable 'ansible_search_path' from source: unknown 10896 1726882180.58532: variable 'ansible_search_path' from source: unknown 10896 1726882180.58535: calling self._execute() 10896 1726882180.58812: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.58815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.58818: variable 'omit' from source: magic vars 10896 1726882180.59004: variable 'ansible_distribution_major_version' from source: facts 10896 1726882180.59015: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882180.59139: variable 'nm_profile_exists' from source: set_fact 10896 1726882180.59155: Evaluated conditional (nm_profile_exists.rc == 0): True 10896 1726882180.59158: variable 'omit' from source: magic vars 10896 1726882180.59208: variable 'omit' from source: magic vars 10896 1726882180.59239: variable 'omit' from source: magic vars 10896 1726882180.59274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882180.59313: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882180.59333: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882180.59351: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882180.59361: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882180.59390: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882180.59421: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.59425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.59503: Set connection var ansible_connection to ssh 10896 1726882180.59513: Set connection var ansible_timeout to 10 10896 1726882180.59516: Set connection var ansible_shell_type to sh 10896 1726882180.59518: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882180.59523: Set connection var ansible_shell_executable to /bin/sh 10896 1726882180.59529: Set connection var ansible_pipelining to False 10896 1726882180.59548: variable 'ansible_shell_executable' from source: unknown 10896 1726882180.59551: variable 'ansible_connection' from source: unknown 10896 1726882180.59553: variable 'ansible_module_compression' from source: unknown 10896 1726882180.59556: variable 'ansible_shell_type' from source: unknown 10896 1726882180.59558: variable 'ansible_shell_executable' from source: unknown 10896 1726882180.59560: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.59563: variable 'ansible_pipelining' from source: unknown 10896 1726882180.59565: variable 'ansible_timeout' from source: unknown 10896 1726882180.59570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.59670: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882180.59679: variable 'omit' from source: magic vars 10896 1726882180.59684: starting attempt loop 10896 1726882180.59687: running the handler 10896 1726882180.59703: handler run complete 10896 1726882180.59720: attempt loop complete, returning result 10896 1726882180.59726: _execute() done 10896 1726882180.59728: dumping result to json 10896 1726882180.59731: done dumping result, returning 10896 1726882180.59733: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-8b02-b216-0000000003ff] 10896 1726882180.59735: sending task result for task 12673a56-9f93-8b02-b216-0000000003ff 10896 1726882180.59813: done sending task result for task 12673a56-9f93-8b02-b216-0000000003ff 10896 1726882180.59816: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 10896 1726882180.59863: no more pending results, returning what we have 10896 1726882180.59866: results queue empty 10896 1726882180.59867: checking for any_errors_fatal 10896 1726882180.59873: done checking for any_errors_fatal 10896 1726882180.59874: checking for max_fail_percentage 10896 1726882180.59875: done checking for max_fail_percentage 10896 1726882180.59877: checking to see if all hosts have failed and the running result is not ok 10896 1726882180.59877: done checking to see if all hosts have failed 10896 1726882180.59878: getting the remaining hosts for this loop 10896 1726882180.59879: done getting the remaining hosts for this loop 10896 1726882180.59882: getting the next task for host managed_node2 10896 1726882180.59891: done getting next task for host managed_node2 10896 1726882180.59895: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 10896 1726882180.59898: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882180.59902: getting variables 10896 1726882180.59903: in VariableManager get_vars() 10896 1726882180.59941: Calling all_inventory to load vars for managed_node2 10896 1726882180.59944: Calling groups_inventory to load vars for managed_node2 10896 1726882180.59946: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882180.59955: Calling all_plugins_play to load vars for managed_node2 10896 1726882180.59957: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882180.59960: Calling groups_plugins_play to load vars for managed_node2 10896 1726882180.60710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882180.61562: done with get_vars() 10896 1726882180.61576: done getting variables 10896 1726882180.61616: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882180.61700: variable 'profile' from source: include params 10896 1726882180.61703: variable 'item' from source: include params 10896 1726882180.61745: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:29:40 -0400 (0:00:00.040) 0:00:22.184 ****** 10896 1726882180.61770: entering _queue_task() for managed_node2/command 10896 1726882180.61977: worker is 1 (out of 1 available) 10896 1726882180.61990: exiting _queue_task() for managed_node2/command 10896 1726882180.62004: done queuing things up, now waiting for results queue to drain 10896 1726882180.62005: waiting for pending results... 
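The next four tasks (get_profile_stat.yml:49, :56, :62 and :69) inspect an ifcfg-bond0.0 file, and each is skipped in this run because the profile_stat result recorded earlier found no such file (false_condition: profile_stat.stat.exists). The first of them, shown executing just below, is a command action; the sketch here illustrates the shape of such a conditional check. Only the task name, the file path and line, and the profile_stat.stat.exists condition come from the log; the grep target, the ifcfg location and the register name are assumptions for illustration.

    # Hedged sketch of the skipped ifcfg check at get_profile_stat.yml:49.
    # The /etc/sysconfig/network-scripts path and the register name are
    # hypothetical; only the when: condition is confirmed by the log.
    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      ansible.builtin.command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
      register: active_ansible_managed   # hypothetical name
      when: profile_stat.stat.exists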
10896 1726882180.62166: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.0 10896 1726882180.62239: in run() - task 12673a56-9f93-8b02-b216-000000000401 10896 1726882180.62250: variable 'ansible_search_path' from source: unknown 10896 1726882180.62254: variable 'ansible_search_path' from source: unknown 10896 1726882180.62279: calling self._execute() 10896 1726882180.62352: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.62355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.62365: variable 'omit' from source: magic vars 10896 1726882180.62629: variable 'ansible_distribution_major_version' from source: facts 10896 1726882180.62637: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882180.62721: variable 'profile_stat' from source: set_fact 10896 1726882180.62731: Evaluated conditional (profile_stat.stat.exists): False 10896 1726882180.62735: when evaluation is False, skipping this task 10896 1726882180.62737: _execute() done 10896 1726882180.62740: dumping result to json 10896 1726882180.62743: done dumping result, returning 10896 1726882180.62748: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [12673a56-9f93-8b02-b216-000000000401] 10896 1726882180.62753: sending task result for task 12673a56-9f93-8b02-b216-000000000401 10896 1726882180.62836: done sending task result for task 12673a56-9f93-8b02-b216-000000000401 10896 1726882180.62839: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10896 1726882180.62920: no more pending results, returning what we have 10896 1726882180.62923: results queue empty 10896 1726882180.62924: checking for any_errors_fatal 10896 1726882180.62928: done checking for any_errors_fatal 10896 1726882180.62929: checking for max_fail_percentage 10896 1726882180.62930: done checking for max_fail_percentage 10896 1726882180.62931: checking to see if all hosts have failed and the running result is not ok 10896 1726882180.62932: done checking to see if all hosts have failed 10896 1726882180.62932: getting the remaining hosts for this loop 10896 1726882180.62933: done getting the remaining hosts for this loop 10896 1726882180.62936: getting the next task for host managed_node2 10896 1726882180.62941: done getting next task for host managed_node2 10896 1726882180.62943: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 10896 1726882180.62946: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882180.62951: getting variables 10896 1726882180.62953: in VariableManager get_vars() 10896 1726882180.62984: Calling all_inventory to load vars for managed_node2 10896 1726882180.62987: Calling groups_inventory to load vars for managed_node2 10896 1726882180.62989: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882180.62998: Calling all_plugins_play to load vars for managed_node2 10896 1726882180.63000: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882180.63002: Calling groups_plugins_play to load vars for managed_node2 10896 1726882180.63817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882180.64653: done with get_vars() 10896 1726882180.64666: done getting variables 10896 1726882180.64706: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882180.64775: variable 'profile' from source: include params 10896 1726882180.64778: variable 'item' from source: include params 10896 1726882180.64817: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:29:40 -0400 (0:00:00.030) 0:00:22.215 ****** 10896 1726882180.64841: entering _queue_task() for managed_node2/set_fact 10896 1726882180.65026: worker is 1 (out of 1 available) 10896 1726882180.65038: exiting _queue_task() for managed_node2/set_fact 10896 1726882180.65049: done queuing things up, now waiting for results queue to drain 10896 1726882180.65050: waiting for pending results... 
10896 1726882180.65208: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 10896 1726882180.65285: in run() - task 12673a56-9f93-8b02-b216-000000000402 10896 1726882180.65301: variable 'ansible_search_path' from source: unknown 10896 1726882180.65305: variable 'ansible_search_path' from source: unknown 10896 1726882180.65331: calling self._execute() 10896 1726882180.65404: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.65408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.65419: variable 'omit' from source: magic vars 10896 1726882180.65666: variable 'ansible_distribution_major_version' from source: facts 10896 1726882180.65674: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882180.65757: variable 'profile_stat' from source: set_fact 10896 1726882180.65766: Evaluated conditional (profile_stat.stat.exists): False 10896 1726882180.65769: when evaluation is False, skipping this task 10896 1726882180.65772: _execute() done 10896 1726882180.65774: dumping result to json 10896 1726882180.65776: done dumping result, returning 10896 1726882180.65783: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [12673a56-9f93-8b02-b216-000000000402] 10896 1726882180.65788: sending task result for task 12673a56-9f93-8b02-b216-000000000402 10896 1726882180.65872: done sending task result for task 12673a56-9f93-8b02-b216-000000000402 10896 1726882180.65875: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10896 1726882180.65919: no more pending results, returning what we have 10896 1726882180.65922: results queue empty 10896 1726882180.65923: checking for any_errors_fatal 10896 1726882180.65928: done checking for any_errors_fatal 10896 1726882180.65929: checking for max_fail_percentage 10896 1726882180.65930: done checking for max_fail_percentage 10896 1726882180.65931: checking to see if all hosts have failed and the running result is not ok 10896 1726882180.65932: done checking to see if all hosts have failed 10896 1726882180.65933: getting the remaining hosts for this loop 10896 1726882180.65934: done getting the remaining hosts for this loop 10896 1726882180.65937: getting the next task for host managed_node2 10896 1726882180.65943: done getting next task for host managed_node2 10896 1726882180.65945: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 10896 1726882180.65949: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882180.65952: getting variables 10896 1726882180.65953: in VariableManager get_vars() 10896 1726882180.65984: Calling all_inventory to load vars for managed_node2 10896 1726882180.65986: Calling groups_inventory to load vars for managed_node2 10896 1726882180.65988: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882180.66004: Calling all_plugins_play to load vars for managed_node2 10896 1726882180.66007: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882180.66010: Calling groups_plugins_play to load vars for managed_node2 10896 1726882180.66722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882180.67575: done with get_vars() 10896 1726882180.67589: done getting variables 10896 1726882180.67633: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882180.67703: variable 'profile' from source: include params 10896 1726882180.67705: variable 'item' from source: include params 10896 1726882180.67745: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:29:40 -0400 (0:00:00.029) 0:00:22.244 ****** 10896 1726882180.67765: entering _queue_task() for managed_node2/command 10896 1726882180.67957: worker is 1 (out of 1 available) 10896 1726882180.67971: exiting _queue_task() for managed_node2/command 10896 1726882180.67984: done queuing things up, now waiting for results queue to drain 10896 1726882180.67985: waiting for pending results... 
10896 1726882180.68145: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.0 10896 1726882180.68219: in run() - task 12673a56-9f93-8b02-b216-000000000403 10896 1726882180.68230: variable 'ansible_search_path' from source: unknown 10896 1726882180.68233: variable 'ansible_search_path' from source: unknown 10896 1726882180.68259: calling self._execute() 10896 1726882180.68333: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.68337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.68345: variable 'omit' from source: magic vars 10896 1726882180.68596: variable 'ansible_distribution_major_version' from source: facts 10896 1726882180.68606: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882180.68688: variable 'profile_stat' from source: set_fact 10896 1726882180.68702: Evaluated conditional (profile_stat.stat.exists): False 10896 1726882180.68705: when evaluation is False, skipping this task 10896 1726882180.68707: _execute() done 10896 1726882180.68710: dumping result to json 10896 1726882180.68712: done dumping result, returning 10896 1726882180.68719: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.0 [12673a56-9f93-8b02-b216-000000000403] 10896 1726882180.68723: sending task result for task 12673a56-9f93-8b02-b216-000000000403 10896 1726882180.68801: done sending task result for task 12673a56-9f93-8b02-b216-000000000403 10896 1726882180.68804: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10896 1726882180.68854: no more pending results, returning what we have 10896 1726882180.68857: results queue empty 10896 1726882180.68858: checking for any_errors_fatal 10896 1726882180.68863: done checking for any_errors_fatal 10896 1726882180.68863: checking for max_fail_percentage 10896 1726882180.68865: done checking for max_fail_percentage 10896 1726882180.68866: checking to see if all hosts have failed and the running result is not ok 10896 1726882180.68867: done checking to see if all hosts have failed 10896 1726882180.68867: getting the remaining hosts for this loop 10896 1726882180.68869: done getting the remaining hosts for this loop 10896 1726882180.68871: getting the next task for host managed_node2 10896 1726882180.68877: done getting next task for host managed_node2 10896 1726882180.68879: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 10896 1726882180.68883: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882180.68886: getting variables 10896 1726882180.68887: in VariableManager get_vars() 10896 1726882180.68921: Calling all_inventory to load vars for managed_node2 10896 1726882180.68924: Calling groups_inventory to load vars for managed_node2 10896 1726882180.68926: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882180.68934: Calling all_plugins_play to load vars for managed_node2 10896 1726882180.68936: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882180.68939: Calling groups_plugins_play to load vars for managed_node2 10896 1726882180.69737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882180.70574: done with get_vars() 10896 1726882180.70590: done getting variables 10896 1726882180.70629: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882180.70698: variable 'profile' from source: include params 10896 1726882180.70701: variable 'item' from source: include params 10896 1726882180.70738: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:29:40 -0400 (0:00:00.029) 0:00:22.274 ****** 10896 1726882180.70758: entering _queue_task() for managed_node2/set_fact 10896 1726882180.70945: worker is 1 (out of 1 available) 10896 1726882180.70958: exiting _queue_task() for managed_node2/set_fact 10896 1726882180.70969: done queuing things up, now waiting for results queue to drain 10896 1726882180.70970: waiting for pending results... 
10896 1726882180.71126: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.0 10896 1726882180.71205: in run() - task 12673a56-9f93-8b02-b216-000000000404 10896 1726882180.71216: variable 'ansible_search_path' from source: unknown 10896 1726882180.71220: variable 'ansible_search_path' from source: unknown 10896 1726882180.71245: calling self._execute() 10896 1726882180.71316: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.71319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.71329: variable 'omit' from source: magic vars 10896 1726882180.71577: variable 'ansible_distribution_major_version' from source: facts 10896 1726882180.71585: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882180.71669: variable 'profile_stat' from source: set_fact 10896 1726882180.71679: Evaluated conditional (profile_stat.stat.exists): False 10896 1726882180.71682: when evaluation is False, skipping this task 10896 1726882180.71685: _execute() done 10896 1726882180.71688: dumping result to json 10896 1726882180.71690: done dumping result, returning 10896 1726882180.71699: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [12673a56-9f93-8b02-b216-000000000404] 10896 1726882180.71705: sending task result for task 12673a56-9f93-8b02-b216-000000000404 10896 1726882180.71785: done sending task result for task 12673a56-9f93-8b02-b216-000000000404 10896 1726882180.71788: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10896 1726882180.71833: no more pending results, returning what we have 10896 1726882180.71836: results queue empty 10896 1726882180.71837: checking for any_errors_fatal 10896 1726882180.71842: done checking for any_errors_fatal 10896 1726882180.71843: checking for max_fail_percentage 10896 1726882180.71845: done checking for max_fail_percentage 10896 1726882180.71845: checking to see if all hosts have failed and the running result is not ok 10896 1726882180.71846: done checking to see if all hosts have failed 10896 1726882180.71847: getting the remaining hosts for this loop 10896 1726882180.71848: done getting the remaining hosts for this loop 10896 1726882180.71851: getting the next task for host managed_node2 10896 1726882180.71857: done getting next task for host managed_node2 10896 1726882180.71859: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 10896 1726882180.71861: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882180.71865: getting variables 10896 1726882180.71866: in VariableManager get_vars() 10896 1726882180.71899: Calling all_inventory to load vars for managed_node2 10896 1726882180.71902: Calling groups_inventory to load vars for managed_node2 10896 1726882180.71904: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882180.71912: Calling all_plugins_play to load vars for managed_node2 10896 1726882180.71915: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882180.71917: Calling groups_plugins_play to load vars for managed_node2 10896 1726882180.72628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882180.73605: done with get_vars() 10896 1726882180.73622: done getting variables 10896 1726882180.73662: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882180.73745: variable 'profile' from source: include params 10896 1726882180.73748: variable 'item' from source: include params 10896 1726882180.73786: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:29:40 -0400 (0:00:00.030) 0:00:22.305 ****** 10896 1726882180.73811: entering _queue_task() for managed_node2/assert 10896 1726882180.74046: worker is 1 (out of 1 available) 10896 1726882180.74061: exiting _queue_task() for managed_node2/assert 10896 1726882180.74074: done queuing things up, now waiting for results queue to drain 10896 1726882180.74076: waiting for pending results... 
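Note on the skip recorded above: the set_fact task at get_profile_stat.yml:69 was not executed because its when condition, profile_stat.stat.exists, evaluated to False for this profile. The log does not print the task body, so the snippet below is only a rough sketch of how such a guarded set_fact is typically written; the variable names (profile, profile_stat, lsr_net_profile_fingerprint) appear in the log, but the assigned value is a placeholder rather than the role's real logic.

# Sketch only -- the real body of get_profile_stat.yml:69 is not shown in this log
- name: "Verify the fingerprint comment in ifcfg-{{ profile }}"
  set_fact:
    lsr_net_profile_fingerprint: true   # the real task presumably derives this from the file contents
  when: profile_stat.stat.exists        # logged as the false_condition that caused the skip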
10896 1726882180.74246: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.0' 10896 1726882180.74318: in run() - task 12673a56-9f93-8b02-b216-000000000268 10896 1726882180.74330: variable 'ansible_search_path' from source: unknown 10896 1726882180.74333: variable 'ansible_search_path' from source: unknown 10896 1726882180.74391: calling self._execute() 10896 1726882180.74457: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.74461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.74469: variable 'omit' from source: magic vars 10896 1726882180.74742: variable 'ansible_distribution_major_version' from source: facts 10896 1726882180.74749: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882180.74756: variable 'omit' from source: magic vars 10896 1726882180.74781: variable 'omit' from source: magic vars 10896 1726882180.74910: variable 'profile' from source: include params 10896 1726882180.74914: variable 'item' from source: include params 10896 1726882180.74941: variable 'item' from source: include params 10896 1726882180.74977: variable 'omit' from source: magic vars 10896 1726882180.74997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882180.75130: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882180.75134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882180.75137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882180.75139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882180.75141: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882180.75144: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.75146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.75217: Set connection var ansible_connection to ssh 10896 1726882180.75235: Set connection var ansible_timeout to 10 10896 1726882180.75239: Set connection var ansible_shell_type to sh 10896 1726882180.75241: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882180.75244: Set connection var ansible_shell_executable to /bin/sh 10896 1726882180.75246: Set connection var ansible_pipelining to False 10896 1726882180.75275: variable 'ansible_shell_executable' from source: unknown 10896 1726882180.75278: variable 'ansible_connection' from source: unknown 10896 1726882180.75281: variable 'ansible_module_compression' from source: unknown 10896 1726882180.75283: variable 'ansible_shell_type' from source: unknown 10896 1726882180.75286: variable 'ansible_shell_executable' from source: unknown 10896 1726882180.75289: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.75291: variable 'ansible_pipelining' from source: unknown 10896 1726882180.75295: variable 'ansible_timeout' from source: unknown 10896 1726882180.75297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.75498: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882180.75502: variable 'omit' from source: magic vars 10896 1726882180.75505: starting attempt loop 10896 1726882180.75507: running the handler 10896 1726882180.75603: variable 'lsr_net_profile_exists' from source: set_fact 10896 1726882180.75607: Evaluated conditional (lsr_net_profile_exists): True 10896 1726882180.75609: handler run complete 10896 1726882180.75611: attempt loop complete, returning result 10896 1726882180.75613: _execute() done 10896 1726882180.75616: dumping result to json 10896 1726882180.75618: done dumping result, returning 10896 1726882180.75621: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.0' [12673a56-9f93-8b02-b216-000000000268] 10896 1726882180.75623: sending task result for task 12673a56-9f93-8b02-b216-000000000268 10896 1726882180.75952: done sending task result for task 12673a56-9f93-8b02-b216-000000000268 10896 1726882180.75955: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 10896 1726882180.76002: no more pending results, returning what we have 10896 1726882180.76005: results queue empty 10896 1726882180.76006: checking for any_errors_fatal 10896 1726882180.76011: done checking for any_errors_fatal 10896 1726882180.76012: checking for max_fail_percentage 10896 1726882180.76014: done checking for max_fail_percentage 10896 1726882180.76015: checking to see if all hosts have failed and the running result is not ok 10896 1726882180.76015: done checking to see if all hosts have failed 10896 1726882180.76016: getting the remaining hosts for this loop 10896 1726882180.76017: done getting the remaining hosts for this loop 10896 1726882180.76021: getting the next task for host managed_node2 10896 1726882180.76026: done getting next task for host managed_node2 10896 1726882180.76029: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 10896 1726882180.76032: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882180.76037: getting variables 10896 1726882180.76038: in VariableManager get_vars() 10896 1726882180.76076: Calling all_inventory to load vars for managed_node2 10896 1726882180.76082: Calling groups_inventory to load vars for managed_node2 10896 1726882180.76085: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882180.76097: Calling all_plugins_play to load vars for managed_node2 10896 1726882180.76100: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882180.76104: Calling groups_plugins_play to load vars for managed_node2 10896 1726882180.77440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882180.78299: done with get_vars() 10896 1726882180.78313: done getting variables 10896 1726882180.78352: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882180.78432: variable 'profile' from source: include params 10896 1726882180.78435: variable 'item' from source: include params 10896 1726882180.78473: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:29:40 -0400 (0:00:00.046) 0:00:22.351 ****** 10896 1726882180.78502: entering _queue_task() for managed_node2/assert 10896 1726882180.78765: worker is 1 (out of 1 available) 10896 1726882180.78777: exiting _queue_task() for managed_node2/assert 10896 1726882180.78788: done queuing things up, now waiting for results queue to drain 10896 1726882180.78789: waiting for pending results... 
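For reference, the task that just passed (assert_profile_present.yml:5) is an assert action whose only condition visible in the log is lsr_net_profile_exists; it evaluated to True, so the module returned its default "All assertions passed" message. A minimal sketch, assuming the that-list contains nothing beyond the logged condition:

- name: "Assert that the profile is present - '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_exists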
10896 1726882180.79251: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.0' 10896 1726882180.79255: in run() - task 12673a56-9f93-8b02-b216-000000000269 10896 1726882180.79259: variable 'ansible_search_path' from source: unknown 10896 1726882180.79262: variable 'ansible_search_path' from source: unknown 10896 1726882180.79264: calling self._execute() 10896 1726882180.79418: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.79423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.79426: variable 'omit' from source: magic vars 10896 1726882180.79674: variable 'ansible_distribution_major_version' from source: facts 10896 1726882180.79679: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882180.79682: variable 'omit' from source: magic vars 10896 1726882180.79722: variable 'omit' from source: magic vars 10896 1726882180.79943: variable 'profile' from source: include params 10896 1726882180.79946: variable 'item' from source: include params 10896 1726882180.79949: variable 'item' from source: include params 10896 1726882180.79952: variable 'omit' from source: magic vars 10896 1726882180.79954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882180.80302: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882180.80306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882180.80310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882180.80313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882180.80316: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882180.80319: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.80322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.80324: Set connection var ansible_connection to ssh 10896 1726882180.80327: Set connection var ansible_timeout to 10 10896 1726882180.80330: Set connection var ansible_shell_type to sh 10896 1726882180.80332: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882180.80335: Set connection var ansible_shell_executable to /bin/sh 10896 1726882180.80339: Set connection var ansible_pipelining to False 10896 1726882180.80341: variable 'ansible_shell_executable' from source: unknown 10896 1726882180.80344: variable 'ansible_connection' from source: unknown 10896 1726882180.80347: variable 'ansible_module_compression' from source: unknown 10896 1726882180.80349: variable 'ansible_shell_type' from source: unknown 10896 1726882180.80352: variable 'ansible_shell_executable' from source: unknown 10896 1726882180.80355: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.80357: variable 'ansible_pipelining' from source: unknown 10896 1726882180.80361: variable 'ansible_timeout' from source: unknown 10896 1726882180.80363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.80617: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882180.80620: variable 'omit' from source: magic vars 10896 1726882180.80622: starting attempt loop 10896 1726882180.80624: running the handler 10896 1726882180.80626: variable 'lsr_net_profile_ansible_managed' from source: set_fact 10896 1726882180.80628: Evaluated conditional (lsr_net_profile_ansible_managed): True 10896 1726882180.80629: handler run complete 10896 1726882180.80631: attempt loop complete, returning result 10896 1726882180.80632: _execute() done 10896 1726882180.80634: dumping result to json 10896 1726882180.80636: done dumping result, returning 10896 1726882180.80637: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.0' [12673a56-9f93-8b02-b216-000000000269] 10896 1726882180.80639: sending task result for task 12673a56-9f93-8b02-b216-000000000269 10896 1726882180.80695: done sending task result for task 12673a56-9f93-8b02-b216-000000000269 10896 1726882180.80699: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 10896 1726882180.80743: no more pending results, returning what we have 10896 1726882180.80746: results queue empty 10896 1726882180.80746: checking for any_errors_fatal 10896 1726882180.80752: done checking for any_errors_fatal 10896 1726882180.80753: checking for max_fail_percentage 10896 1726882180.80755: done checking for max_fail_percentage 10896 1726882180.80756: checking to see if all hosts have failed and the running result is not ok 10896 1726882180.80757: done checking to see if all hosts have failed 10896 1726882180.80758: getting the remaining hosts for this loop 10896 1726882180.80759: done getting the remaining hosts for this loop 10896 1726882180.80762: getting the next task for host managed_node2 10896 1726882180.80767: done getting next task for host managed_node2 10896 1726882180.80769: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 10896 1726882180.80772: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882180.80776: getting variables 10896 1726882180.80778: in VariableManager get_vars() 10896 1726882180.80822: Calling all_inventory to load vars for managed_node2 10896 1726882180.80826: Calling groups_inventory to load vars for managed_node2 10896 1726882180.80828: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882180.80838: Calling all_plugins_play to load vars for managed_node2 10896 1726882180.80840: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882180.80843: Calling groups_plugins_play to load vars for managed_node2 10896 1726882180.82982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882180.86613: done with get_vars() 10896 1726882180.86644: done getting variables 10896 1726882180.86710: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882180.86920: variable 'profile' from source: include params 10896 1726882180.86924: variable 'item' from source: include params 10896 1726882180.87001: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:29:40 -0400 (0:00:00.085) 0:00:22.437 ****** 10896 1726882180.87038: entering _queue_task() for managed_node2/assert 10896 1726882180.87350: worker is 1 (out of 1 available) 10896 1726882180.87362: exiting _queue_task() for managed_node2/assert 10896 1726882180.87373: done queuing things up, now waiting for results queue to drain 10896 1726882180.87375: waiting for pending results... 
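The executor entries above also record the effective connection settings resolved before the assert ran ("Set connection var ..."). Expressed as Ansible variables, those values would look like the sketch below; where they are actually defined (built-in defaults, ansible.cfg, or inventory) is not visible in this log.

# Effective connection-related variables as logged by the executor (values only; their origin is assumed)
ansible_connection: ssh
ansible_timeout: 10
ansible_shell_type: sh
ansible_module_compression: ZIP_DEFLATED
ansible_shell_executable: /bin/sh
ansible_pipelining: false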
10896 1726882180.88110: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.0 10896 1726882180.88116: in run() - task 12673a56-9f93-8b02-b216-00000000026a 10896 1726882180.88119: variable 'ansible_search_path' from source: unknown 10896 1726882180.88122: variable 'ansible_search_path' from source: unknown 10896 1726882180.88125: calling self._execute() 10896 1726882180.88128: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.88130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.88133: variable 'omit' from source: magic vars 10896 1726882180.88275: variable 'ansible_distribution_major_version' from source: facts 10896 1726882180.88300: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882180.88304: variable 'omit' from source: magic vars 10896 1726882180.88421: variable 'omit' from source: magic vars 10896 1726882180.88425: variable 'profile' from source: include params 10896 1726882180.88433: variable 'item' from source: include params 10896 1726882180.88491: variable 'item' from source: include params 10896 1726882180.88512: variable 'omit' from source: magic vars 10896 1726882180.88548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882180.88588: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882180.88610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882180.88628: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882180.88637: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882180.88666: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882180.88670: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.88673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.88899: Set connection var ansible_connection to ssh 10896 1726882180.88902: Set connection var ansible_timeout to 10 10896 1726882180.88904: Set connection var ansible_shell_type to sh 10896 1726882180.88907: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882180.88940: Set connection var ansible_shell_executable to /bin/sh 10896 1726882180.88952: Set connection var ansible_pipelining to False 10896 1726882180.88980: variable 'ansible_shell_executable' from source: unknown 10896 1726882180.88988: variable 'ansible_connection' from source: unknown 10896 1726882180.88998: variable 'ansible_module_compression' from source: unknown 10896 1726882180.89301: variable 'ansible_shell_type' from source: unknown 10896 1726882180.89304: variable 'ansible_shell_executable' from source: unknown 10896 1726882180.89307: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.89309: variable 'ansible_pipelining' from source: unknown 10896 1726882180.89312: variable 'ansible_timeout' from source: unknown 10896 1726882180.89315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.89383: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882180.89405: variable 'omit' from source: magic vars 10896 1726882180.89417: starting attempt loop 10896 1726882180.89423: running the handler 10896 1726882180.89648: variable 'lsr_net_profile_fingerprint' from source: set_fact 10896 1726882180.89659: Evaluated conditional (lsr_net_profile_fingerprint): True 10896 1726882180.89719: handler run complete 10896 1726882180.89738: attempt loop complete, returning result 10896 1726882180.89746: _execute() done 10896 1726882180.89753: dumping result to json 10896 1726882180.89761: done dumping result, returning 10896 1726882180.89774: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.0 [12673a56-9f93-8b02-b216-00000000026a] 10896 1726882180.89796: sending task result for task 12673a56-9f93-8b02-b216-00000000026a ok: [managed_node2] => { "changed": false } MSG: All assertions passed 10896 1726882180.89962: no more pending results, returning what we have 10896 1726882180.89965: results queue empty 10896 1726882180.89966: checking for any_errors_fatal 10896 1726882180.89975: done checking for any_errors_fatal 10896 1726882180.89976: checking for max_fail_percentage 10896 1726882180.89978: done checking for max_fail_percentage 10896 1726882180.89979: checking to see if all hosts have failed and the running result is not ok 10896 1726882180.89980: done checking to see if all hosts have failed 10896 1726882180.89981: getting the remaining hosts for this loop 10896 1726882180.89982: done getting the remaining hosts for this loop 10896 1726882180.89985: getting the next task for host managed_node2 10896 1726882180.90000: done getting next task for host managed_node2 10896 1726882180.90003: ^ task is: TASK: Include the task 'get_profile_stat.yml' 10896 1726882180.90005: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882180.90009: getting variables 10896 1726882180.90011: in VariableManager get_vars() 10896 1726882180.90056: Calling all_inventory to load vars for managed_node2 10896 1726882180.90059: Calling groups_inventory to load vars for managed_node2 10896 1726882180.90061: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882180.90072: Calling all_plugins_play to load vars for managed_node2 10896 1726882180.90074: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882180.90078: Calling groups_plugins_play to load vars for managed_node2 10896 1726882180.90802: done sending task result for task 12673a56-9f93-8b02-b216-00000000026a 10896 1726882180.90806: WORKER PROCESS EXITING 10896 1726882180.92026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882180.93991: done with get_vars() 10896 1726882180.94015: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:29:40 -0400 (0:00:00.070) 0:00:22.510 ****** 10896 1726882180.94403: entering _queue_task() for managed_node2/include_tasks 10896 1726882180.94842: worker is 1 (out of 1 available) 10896 1726882180.94855: exiting _queue_task() for managed_node2/include_tasks 10896 1726882180.94866: done queuing things up, now waiting for results queue to drain 10896 1726882180.94868: waiting for pending results... 10896 1726882180.95335: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 10896 1726882180.95549: in run() - task 12673a56-9f93-8b02-b216-00000000026e 10896 1726882180.95562: variable 'ansible_search_path' from source: unknown 10896 1726882180.95566: variable 'ansible_search_path' from source: unknown 10896 1726882180.95758: calling self._execute() 10896 1726882180.95904: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882180.95911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882180.95920: variable 'omit' from source: magic vars 10896 1726882180.96779: variable 'ansible_distribution_major_version' from source: facts 10896 1726882180.96790: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882180.96948: _execute() done 10896 1726882180.96951: dumping result to json 10896 1726882180.96956: done dumping result, returning 10896 1726882180.96963: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-8b02-b216-00000000026e] 10896 1726882180.96969: sending task result for task 12673a56-9f93-8b02-b216-00000000026e 10896 1726882180.97220: done sending task result for task 12673a56-9f93-8b02-b216-00000000026e 10896 1726882180.97223: WORKER PROCESS EXITING 10896 1726882180.97249: no more pending results, returning what we have 10896 1726882180.97253: in VariableManager get_vars() 10896 1726882180.97308: Calling all_inventory to load vars for managed_node2 10896 1726882180.97312: Calling groups_inventory to load vars for managed_node2 10896 1726882180.97315: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882180.97329: Calling all_plugins_play to load vars for managed_node2 10896 1726882180.97333: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882180.97337: Calling groups_plugins_play 
to load vars for managed_node2 10896 1726882180.99712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882181.01400: done with get_vars() 10896 1726882181.01429: variable 'ansible_search_path' from source: unknown 10896 1726882181.01431: variable 'ansible_search_path' from source: unknown 10896 1726882181.01467: we have included files to process 10896 1726882181.01468: generating all_blocks data 10896 1726882181.01470: done generating all_blocks data 10896 1726882181.01476: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10896 1726882181.01477: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10896 1726882181.01479: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10896 1726882181.02437: done processing included file 10896 1726882181.02438: iterating over new_blocks loaded from include file 10896 1726882181.02440: in VariableManager get_vars() 10896 1726882181.02460: done with get_vars() 10896 1726882181.02461: filtering new block on tags 10896 1726882181.02482: done filtering new block on tags 10896 1726882181.02484: in VariableManager get_vars() 10896 1726882181.02505: done with get_vars() 10896 1726882181.02508: filtering new block on tags 10896 1726882181.02551: done filtering new block on tags 10896 1726882181.02554: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 10896 1726882181.02564: extending task lists for all hosts with included blocks 10896 1726882181.02770: done extending task lists 10896 1726882181.02772: done processing included files 10896 1726882181.02773: results queue empty 10896 1726882181.02773: checking for any_errors_fatal 10896 1726882181.02777: done checking for any_errors_fatal 10896 1726882181.02777: checking for max_fail_percentage 10896 1726882181.02778: done checking for max_fail_percentage 10896 1726882181.02779: checking to see if all hosts have failed and the running result is not ok 10896 1726882181.02780: done checking to see if all hosts have failed 10896 1726882181.02781: getting the remaining hosts for this loop 10896 1726882181.02782: done getting the remaining hosts for this loop 10896 1726882181.02784: getting the next task for host managed_node2 10896 1726882181.02788: done getting next task for host managed_node2 10896 1726882181.02790: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 10896 1726882181.02796: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882181.02799: getting variables 10896 1726882181.02800: in VariableManager get_vars() 10896 1726882181.02815: Calling all_inventory to load vars for managed_node2 10896 1726882181.02817: Calling groups_inventory to load vars for managed_node2 10896 1726882181.02820: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882181.02826: Calling all_plugins_play to load vars for managed_node2 10896 1726882181.02828: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882181.02831: Calling groups_plugins_play to load vars for managed_node2 10896 1726882181.04072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882181.05882: done with get_vars() 10896 1726882181.05915: done getting variables 10896 1726882181.05957: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:29:41 -0400 (0:00:00.115) 0:00:22.626 ****** 10896 1726882181.05987: entering _queue_task() for managed_node2/set_fact 10896 1726882181.06365: worker is 1 (out of 1 available) 10896 1726882181.06376: exiting _queue_task() for managed_node2/set_fact 10896 1726882181.06391: done queuing things up, now waiting for results queue to drain 10896 1726882181.06395: waiting for pending results... 
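At this point the log has finished the three assertions for 'bond0.0' and, via the include at assert_profile_present.yml:3, has queued get_profile_stat.yml again for the next item. Based only on the task names, paths and conditions logged above, the corresponding parts of assert_profile_present.yml plausibly look like the sketch below (the asserts at lines 5 and 10 follow the same pattern already sketched earlier):

# Sketch of assert_profile_present.yml:3 and :15 as reconstructed from the log
- name: Include the task 'get_profile_stat.yml'
  include_tasks: get_profile_stat.yml

- name: "Assert that the fingerprint comment is present in {{ profile }}"
  assert:
    that:
      - lsr_net_profile_fingerprint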
10896 1726882181.06587: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 10896 1726882181.06669: in run() - task 12673a56-9f93-8b02-b216-000000000443 10896 1726882181.06679: variable 'ansible_search_path' from source: unknown 10896 1726882181.06682: variable 'ansible_search_path' from source: unknown 10896 1726882181.06714: calling self._execute() 10896 1726882181.06788: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882181.06791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882181.06805: variable 'omit' from source: magic vars 10896 1726882181.07079: variable 'ansible_distribution_major_version' from source: facts 10896 1726882181.07089: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882181.07096: variable 'omit' from source: magic vars 10896 1726882181.07129: variable 'omit' from source: magic vars 10896 1726882181.07152: variable 'omit' from source: magic vars 10896 1726882181.07184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882181.07217: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882181.07234: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882181.07269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882181.07272: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882181.07296: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882181.07301: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882181.07305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882181.07420: Set connection var ansible_connection to ssh 10896 1726882181.07423: Set connection var ansible_timeout to 10 10896 1726882181.07428: Set connection var ansible_shell_type to sh 10896 1726882181.07431: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882181.07433: Set connection var ansible_shell_executable to /bin/sh 10896 1726882181.07435: Set connection var ansible_pipelining to False 10896 1726882181.07495: variable 'ansible_shell_executable' from source: unknown 10896 1726882181.07499: variable 'ansible_connection' from source: unknown 10896 1726882181.07502: variable 'ansible_module_compression' from source: unknown 10896 1726882181.07504: variable 'ansible_shell_type' from source: unknown 10896 1726882181.07506: variable 'ansible_shell_executable' from source: unknown 10896 1726882181.07508: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882181.07510: variable 'ansible_pipelining' from source: unknown 10896 1726882181.07512: variable 'ansible_timeout' from source: unknown 10896 1726882181.07514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882181.07633: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882181.07637: variable 
'omit' from source: magic vars 10896 1726882181.07640: starting attempt loop 10896 1726882181.07642: running the handler 10896 1726882181.07645: handler run complete 10896 1726882181.07661: attempt loop complete, returning result 10896 1726882181.07663: _execute() done 10896 1726882181.07665: dumping result to json 10896 1726882181.07673: done dumping result, returning 10896 1726882181.07676: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-8b02-b216-000000000443] 10896 1726882181.07678: sending task result for task 12673a56-9f93-8b02-b216-000000000443 10896 1726882181.07754: done sending task result for task 12673a56-9f93-8b02-b216-000000000443 10896 1726882181.07756: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 10896 1726882181.07841: no more pending results, returning what we have 10896 1726882181.07844: results queue empty 10896 1726882181.07844: checking for any_errors_fatal 10896 1726882181.07846: done checking for any_errors_fatal 10896 1726882181.07846: checking for max_fail_percentage 10896 1726882181.07848: done checking for max_fail_percentage 10896 1726882181.07849: checking to see if all hosts have failed and the running result is not ok 10896 1726882181.07850: done checking to see if all hosts have failed 10896 1726882181.07850: getting the remaining hosts for this loop 10896 1726882181.07851: done getting the remaining hosts for this loop 10896 1726882181.07855: getting the next task for host managed_node2 10896 1726882181.07862: done getting next task for host managed_node2 10896 1726882181.07864: ^ task is: TASK: Stat profile file 10896 1726882181.07868: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882181.07872: getting variables 10896 1726882181.07873: in VariableManager get_vars() 10896 1726882181.07916: Calling all_inventory to load vars for managed_node2 10896 1726882181.07919: Calling groups_inventory to load vars for managed_node2 10896 1726882181.07921: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882181.07930: Calling all_plugins_play to load vars for managed_node2 10896 1726882181.07933: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882181.07935: Calling groups_plugins_play to load vars for managed_node2 10896 1726882181.09400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882181.10763: done with get_vars() 10896 1726882181.10781: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:29:41 -0400 (0:00:00.048) 0:00:22.675 ****** 10896 1726882181.10873: entering _queue_task() for managed_node2/stat 10896 1726882181.11191: worker is 1 (out of 1 available) 10896 1726882181.11205: exiting _queue_task() for managed_node2/stat 10896 1726882181.11217: done queuing things up, now waiting for results queue to drain 10896 1726882181.11219: waiting for pending results... 10896 1726882181.11615: running TaskExecutor() for managed_node2/TASK: Stat profile file 10896 1726882181.11622: in run() - task 12673a56-9f93-8b02-b216-000000000444 10896 1726882181.11643: variable 'ansible_search_path' from source: unknown 10896 1726882181.11652: variable 'ansible_search_path' from source: unknown 10896 1726882181.11696: calling self._execute() 10896 1726882181.11808: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882181.11827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882181.11842: variable 'omit' from source: magic vars 10896 1726882181.12244: variable 'ansible_distribution_major_version' from source: facts 10896 1726882181.12265: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882181.12275: variable 'omit' from source: magic vars 10896 1726882181.12321: variable 'omit' from source: magic vars 10896 1726882181.12427: variable 'profile' from source: include params 10896 1726882181.12438: variable 'item' from source: include params 10896 1726882181.12509: variable 'item' from source: include params 10896 1726882181.12534: variable 'omit' from source: magic vars 10896 1726882181.12690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882181.12696: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882181.12699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882181.12701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882181.12703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882181.12725: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882181.12734: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882181.12742: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882181.12849: Set connection var ansible_connection to ssh 10896 1726882181.12863: Set connection var ansible_timeout to 10 10896 1726882181.12870: Set connection var ansible_shell_type to sh 10896 1726882181.12883: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882181.12891: Set connection var ansible_shell_executable to /bin/sh 10896 1726882181.12908: Set connection var ansible_pipelining to False 10896 1726882181.12936: variable 'ansible_shell_executable' from source: unknown 10896 1726882181.12944: variable 'ansible_connection' from source: unknown 10896 1726882181.12950: variable 'ansible_module_compression' from source: unknown 10896 1726882181.12957: variable 'ansible_shell_type' from source: unknown 10896 1726882181.12963: variable 'ansible_shell_executable' from source: unknown 10896 1726882181.12968: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882181.12974: variable 'ansible_pipelining' from source: unknown 10896 1726882181.12980: variable 'ansible_timeout' from source: unknown 10896 1726882181.12987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882181.13199: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10896 1726882181.13217: variable 'omit' from source: magic vars 10896 1726882181.13339: starting attempt loop 10896 1726882181.13342: running the handler 10896 1726882181.13345: _low_level_execute_command(): starting 10896 1726882181.13347: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882181.14009: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882181.14069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882181.14095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882181.14121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882181.14219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882181.15855: stdout chunk (state=3): >>>/root <<< 10896 1726882181.16019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882181.16023: stdout chunk (state=3): >>><<< 10896 1726882181.16025: stderr chunk (state=3): >>><<< 10896 1726882181.16048: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882181.16068: _low_level_execute_command(): starting 10896 1726882181.16155: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882181.160553-12084-162161020868319 `" && echo ansible-tmp-1726882181.160553-12084-162161020868319="` echo /root/.ansible/tmp/ansible-tmp-1726882181.160553-12084-162161020868319 `" ) && sleep 0' 10896 1726882181.16710: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882181.16730: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882181.16748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882181.16767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882181.16809: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882181.16898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882181.16924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882181.17029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882181.18899: stdout chunk (state=3): >>>ansible-tmp-1726882181.160553-12084-162161020868319=/root/.ansible/tmp/ansible-tmp-1726882181.160553-12084-162161020868319 <<< 10896 1726882181.19042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882181.19072: stderr chunk (state=3): 
>>><<< 10896 1726882181.19075: stdout chunk (state=3): >>><<< 10896 1726882181.19094: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882181.160553-12084-162161020868319=/root/.ansible/tmp/ansible-tmp-1726882181.160553-12084-162161020868319 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882181.19162: variable 'ansible_module_compression' from source: unknown 10896 1726882181.19212: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10896 1726882181.19268: variable 'ansible_facts' from source: unknown 10896 1726882181.19443: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882181.160553-12084-162161020868319/AnsiballZ_stat.py 10896 1726882181.19517: Sending initial data 10896 1726882181.19527: Sent initial data (152 bytes) 10896 1726882181.20114: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882181.20217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882181.20224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882181.20268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882181.20272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882181.20333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882181.21858: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 10896 1726882181.21879: stderr chunk 
(state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 10896 1726882181.21891: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 10896 1726882181.21926: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882181.22006: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882181.22072: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpfchp7hrk /root/.ansible/tmp/ansible-tmp-1726882181.160553-12084-162161020868319/AnsiballZ_stat.py <<< 10896 1726882181.22076: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882181.160553-12084-162161020868319/AnsiballZ_stat.py" <<< 10896 1726882181.22129: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpfchp7hrk" to remote "/root/.ansible/tmp/ansible-tmp-1726882181.160553-12084-162161020868319/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882181.160553-12084-162161020868319/AnsiballZ_stat.py" <<< 10896 1726882181.22961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882181.23060: stderr chunk (state=3): >>><<< 10896 1726882181.23063: stdout chunk (state=3): >>><<< 10896 1726882181.23065: done transferring module to remote 10896 1726882181.23067: _low_level_execute_command(): starting 10896 1726882181.23069: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882181.160553-12084-162161020868319/ /root/.ansible/tmp/ansible-tmp-1726882181.160553-12084-162161020868319/AnsiballZ_stat.py && sleep 0' 10896 1726882181.23682: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882181.23724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882181.23742: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882181.23837: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882181.23853: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882181.23872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882181.23965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882181.25691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882181.25723: stderr chunk (state=3): >>><<< 10896 1726882181.25725: stdout chunk (state=3): >>><<< 10896 1726882181.25735: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882181.25778: _low_level_execute_command(): starting 10896 1726882181.25782: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882181.160553-12084-162161020868319/AnsiballZ_stat.py && sleep 0' 10896 1726882181.26149: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882181.26152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882181.26155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882181.26157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882181.26159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882181.26206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882181.26211: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882181.26277: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 10896 1726882181.41192: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10896 1726882181.42600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882181.42604: stdout chunk (state=3): >>><<< 10896 1726882181.42607: stderr chunk (state=3): >>><<< 10896 1726882181.42610: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
10896 1726882181.42613: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882181.160553-12084-162161020868319/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882181.42615: _low_level_execute_command(): starting 10896 1726882181.42617: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882181.160553-12084-162161020868319/ > /dev/null 2>&1 && sleep 0' 10896 1726882181.43167: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882181.43176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882181.43187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882181.43207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882181.43221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882181.43229: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882181.43272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882181.43338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882181.43351: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882181.43381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882181.43464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882181.45311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882181.45330: stderr chunk (state=3): >>><<< 10896 1726882181.45343: stdout chunk (state=3): >>><<< 10896 1726882181.45599: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882181.45603: handler run complete 10896 1726882181.45605: attempt loop complete, returning result 10896 1726882181.45607: _execute() done 10896 1726882181.45609: dumping result to json 10896 1726882181.45611: done dumping result, returning 10896 1726882181.45613: done running TaskExecutor() for managed_node2/TASK: Stat profile file [12673a56-9f93-8b02-b216-000000000444] 10896 1726882181.45615: sending task result for task 12673a56-9f93-8b02-b216-000000000444 10896 1726882181.45685: done sending task result for task 12673a56-9f93-8b02-b216-000000000444 10896 1726882181.45689: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 10896 1726882181.45753: no more pending results, returning what we have 10896 1726882181.45756: results queue empty 10896 1726882181.45757: checking for any_errors_fatal 10896 1726882181.45764: done checking for any_errors_fatal 10896 1726882181.45765: checking for max_fail_percentage 10896 1726882181.45766: done checking for max_fail_percentage 10896 1726882181.45767: checking to see if all hosts have failed and the running result is not ok 10896 1726882181.45768: done checking to see if all hosts have failed 10896 1726882181.45769: getting the remaining hosts for this loop 10896 1726882181.45771: done getting the remaining hosts for this loop 10896 1726882181.45775: getting the next task for host managed_node2 10896 1726882181.45782: done getting next task for host managed_node2 10896 1726882181.45784: ^ task is: TASK: Set NM profile exist flag based on the profile files 10896 1726882181.45788: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882181.45795: getting variables 10896 1726882181.45797: in VariableManager get_vars() 10896 1726882181.45842: Calling all_inventory to load vars for managed_node2 10896 1726882181.45845: Calling groups_inventory to load vars for managed_node2 10896 1726882181.45847: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882181.45858: Calling all_plugins_play to load vars for managed_node2 10896 1726882181.45862: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882181.45865: Calling groups_plugins_play to load vars for managed_node2 10896 1726882181.47561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882181.49383: done with get_vars() 10896 1726882181.49418: done getting variables 10896 1726882181.49480: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:29:41 -0400 (0:00:00.386) 0:00:23.062 ****** 10896 1726882181.49519: entering _queue_task() for managed_node2/set_fact 10896 1726882181.49931: worker is 1 (out of 1 available) 10896 1726882181.49944: exiting _queue_task() for managed_node2/set_fact 10896 1726882181.49955: done queuing things up, now waiting for results queue to drain 10896 1726882181.49956: waiting for pending results... 
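The stat run above targets /etc/sysconfig/network-scripts/ifcfg-bond0.1 with get_attributes, get_checksum and get_mime all disabled, and the conditional evaluated later in the log references profile_stat.stat.exists. Based only on those details, the "Stat profile file" task in get_profile_stat.yml likely looks roughly like the sketch below; this is reconstructed from the rendered module_args, not copied from the file, and the hard-coded path stands in for whatever templated profile name the task actually uses:

- name: Stat profile file
  ansible.builtin.stat:
    # rendered values taken from the module_args shown in the log
    path: /etc/sysconfig/network-scripts/ifcfg-bond0.1
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat   # referenced below as profile_stat.stat.exists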
10896 1726882181.50223: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 10896 1726882181.50373: in run() - task 12673a56-9f93-8b02-b216-000000000445 10896 1726882181.50378: variable 'ansible_search_path' from source: unknown 10896 1726882181.50381: variable 'ansible_search_path' from source: unknown 10896 1726882181.50391: calling self._execute() 10896 1726882181.50504: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882181.50517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882181.50535: variable 'omit' from source: magic vars 10896 1726882181.51025: variable 'ansible_distribution_major_version' from source: facts 10896 1726882181.51028: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882181.51063: variable 'profile_stat' from source: set_fact 10896 1726882181.51078: Evaluated conditional (profile_stat.stat.exists): False 10896 1726882181.51088: when evaluation is False, skipping this task 10896 1726882181.51096: _execute() done 10896 1726882181.51104: dumping result to json 10896 1726882181.51109: done dumping result, returning 10896 1726882181.51118: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-8b02-b216-000000000445] 10896 1726882181.51126: sending task result for task 12673a56-9f93-8b02-b216-000000000445 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10896 1726882181.51353: no more pending results, returning what we have 10896 1726882181.51357: results queue empty 10896 1726882181.51359: checking for any_errors_fatal 10896 1726882181.51368: done checking for any_errors_fatal 10896 1726882181.51369: checking for max_fail_percentage 10896 1726882181.51371: done checking for max_fail_percentage 10896 1726882181.51372: checking to see if all hosts have failed and the running result is not ok 10896 1726882181.51373: done checking to see if all hosts have failed 10896 1726882181.51374: getting the remaining hosts for this loop 10896 1726882181.51376: done getting the remaining hosts for this loop 10896 1726882181.51379: getting the next task for host managed_node2 10896 1726882181.51387: done getting next task for host managed_node2 10896 1726882181.51390: ^ task is: TASK: Get NM profile info 10896 1726882181.51397: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882181.51402: getting variables 10896 1726882181.51404: in VariableManager get_vars() 10896 1726882181.51449: Calling all_inventory to load vars for managed_node2 10896 1726882181.51452: Calling groups_inventory to load vars for managed_node2 10896 1726882181.51457: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882181.51578: done sending task result for task 12673a56-9f93-8b02-b216-000000000445 10896 1726882181.51581: WORKER PROCESS EXITING 10896 1726882181.51594: Calling all_plugins_play to load vars for managed_node2 10896 1726882181.51598: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882181.51602: Calling groups_plugins_play to load vars for managed_node2 10896 1726882181.53125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882181.54688: done with get_vars() 10896 1726882181.54721: done getting variables 10896 1726882181.54785: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:29:41 -0400 (0:00:00.052) 0:00:23.115 ****** 10896 1726882181.54823: entering _queue_task() for managed_node2/shell 10896 1726882181.55195: worker is 1 (out of 1 available) 10896 1726882181.55210: exiting _queue_task() for managed_node2/shell 10896 1726882181.55223: done queuing things up, now waiting for results queue to drain 10896 1726882181.55225: waiting for pending results... 
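The "Set NM profile exist flag based on the profile files" task was skipped because profile_stat.stat.exists evaluated to False, so the fact it would have set never appears in this log. The sketch below assumes it sets lsr_net_profile_exists, by analogy with the facts set later in the run; only the when condition is confirmed by the skip report above:

- name: Set NM profile exist flag based on the profile files
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true   # assumed fact name; not confirmed by this log
  when: profile_stat.stat.exists   # the false_condition reported in the skip above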
10896 1726882181.55525: running TaskExecutor() for managed_node2/TASK: Get NM profile info 10896 1726882181.55664: in run() - task 12673a56-9f93-8b02-b216-000000000446 10896 1726882181.55689: variable 'ansible_search_path' from source: unknown 10896 1726882181.55700: variable 'ansible_search_path' from source: unknown 10896 1726882181.55753: calling self._execute() 10896 1726882181.55869: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882181.55881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882181.55897: variable 'omit' from source: magic vars 10896 1726882181.56306: variable 'ansible_distribution_major_version' from source: facts 10896 1726882181.56325: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882181.56337: variable 'omit' from source: magic vars 10896 1726882181.56396: variable 'omit' from source: magic vars 10896 1726882181.56510: variable 'profile' from source: include params 10896 1726882181.56521: variable 'item' from source: include params 10896 1726882181.56595: variable 'item' from source: include params 10896 1726882181.56620: variable 'omit' from source: magic vars 10896 1726882181.56702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882181.56714: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882181.56740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882181.56761: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882181.56778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882181.56822: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882181.56998: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882181.57001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882181.57003: Set connection var ansible_connection to ssh 10896 1726882181.57006: Set connection var ansible_timeout to 10 10896 1726882181.57008: Set connection var ansible_shell_type to sh 10896 1726882181.57010: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882181.57012: Set connection var ansible_shell_executable to /bin/sh 10896 1726882181.57014: Set connection var ansible_pipelining to False 10896 1726882181.57016: variable 'ansible_shell_executable' from source: unknown 10896 1726882181.57018: variable 'ansible_connection' from source: unknown 10896 1726882181.57020: variable 'ansible_module_compression' from source: unknown 10896 1726882181.57022: variable 'ansible_shell_type' from source: unknown 10896 1726882181.57024: variable 'ansible_shell_executable' from source: unknown 10896 1726882181.57036: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882181.57044: variable 'ansible_pipelining' from source: unknown 10896 1726882181.57051: variable 'ansible_timeout' from source: unknown 10896 1726882181.57060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882181.57216: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882181.57237: variable 'omit' from source: magic vars 10896 1726882181.57255: starting attempt loop 10896 1726882181.57264: running the handler 10896 1726882181.57280: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882181.57306: _low_level_execute_command(): starting 10896 1726882181.57321: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882181.58120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882181.58124: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882181.58127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882181.58131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882181.58134: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882181.58136: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882181.58163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882181.58242: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882181.58246: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882181.58262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882181.58315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882181.58392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882181.60005: stdout chunk (state=3): >>>/root <<< 10896 1726882181.60258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882181.60261: stdout chunk (state=3): >>><<< 10896 1726882181.60264: stderr chunk (state=3): >>><<< 10896 1726882181.60267: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882181.60269: _low_level_execute_command(): starting 10896 1726882181.60272: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882181.6017349-12113-245858708264757 `" && echo ansible-tmp-1726882181.6017349-12113-245858708264757="` echo /root/.ansible/tmp/ansible-tmp-1726882181.6017349-12113-245858708264757 `" ) && sleep 0' 10896 1726882181.60870: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882181.60874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882181.60876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882181.60878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882181.60881: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882181.60883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 10896 1726882181.60886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882181.60977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882181.60996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882181.61090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882181.62961: stdout chunk (state=3): >>>ansible-tmp-1726882181.6017349-12113-245858708264757=/root/.ansible/tmp/ansible-tmp-1726882181.6017349-12113-245858708264757 <<< 10896 1726882181.63100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882181.63201: stderr chunk (state=3): >>><<< 10896 1726882181.63205: stdout chunk (state=3): >>><<< 10896 1726882181.63208: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882181.6017349-12113-245858708264757=/root/.ansible/tmp/ansible-tmp-1726882181.6017349-12113-245858708264757 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882181.63211: variable 'ansible_module_compression' from source: unknown 10896 1726882181.63213: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10896 1726882181.63249: variable 'ansible_facts' from source: unknown 10896 1726882181.63352: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882181.6017349-12113-245858708264757/AnsiballZ_command.py 10896 1726882181.63554: Sending initial data 10896 1726882181.63557: Sent initial data (156 bytes) 10896 1726882181.64119: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882181.64128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882181.64139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882181.64154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882181.64167: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882181.64174: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882181.64184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882181.64279: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882181.64335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882181.64400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882181.65935: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 
debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882181.66004: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882181.66073: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmp6bsrsvjk /root/.ansible/tmp/ansible-tmp-1726882181.6017349-12113-245858708264757/AnsiballZ_command.py <<< 10896 1726882181.66095: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882181.6017349-12113-245858708264757/AnsiballZ_command.py" <<< 10896 1726882181.66142: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 10896 1726882181.66156: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmp6bsrsvjk" to remote "/root/.ansible/tmp/ansible-tmp-1726882181.6017349-12113-245858708264757/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882181.6017349-12113-245858708264757/AnsiballZ_command.py" <<< 10896 1726882181.67067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882181.67070: stdout chunk (state=3): >>><<< 10896 1726882181.67072: stderr chunk (state=3): >>><<< 10896 1726882181.67082: done transferring module to remote 10896 1726882181.67099: _low_level_execute_command(): starting 10896 1726882181.67107: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882181.6017349-12113-245858708264757/ /root/.ansible/tmp/ansible-tmp-1726882181.6017349-12113-245858708264757/AnsiballZ_command.py && sleep 0' 10896 1726882181.67729: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882181.67758: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882181.67864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882181.67878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882181.67904: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882181.68001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882181.69770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882181.69784: stdout chunk 
(state=3): >>><<< 10896 1726882181.69798: stderr chunk (state=3): >>><<< 10896 1726882181.69820: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882181.69829: _low_level_execute_command(): starting 10896 1726882181.69838: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882181.6017349-12113-245858708264757/AnsiballZ_command.py && sleep 0' 10896 1726882181.70634: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882181.70644: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882181.70673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882181.70781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882181.87745: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 21:29:41.855880", "end": "2024-09-20 21:29:41.876181", "delta": "0:00:00.020301", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": 
null, "creates": null, "removes": null, "stdin": null}}} <<< 10896 1726882181.89241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882181.89245: stdout chunk (state=3): >>><<< 10896 1726882181.89247: stderr chunk (state=3): >>><<< 10896 1726882181.89401: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 21:29:41.855880", "end": "2024-09-20 21:29:41.876181", "delta": "0:00:00.020301", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
10896 1726882181.89405: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882181.6017349-12113-245858708264757/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882181.89412: _low_level_execute_command(): starting 10896 1726882181.89415: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882181.6017349-12113-245858708264757/ > /dev/null 2>&1 && sleep 0' 10896 1726882181.90033: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882181.90046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882181.90088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882181.90108: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882181.90208: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882181.90235: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882181.90327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882181.92105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882181.92137: stderr chunk (state=3): >>><<< 10896 1726882181.92140: stdout chunk (state=3): >>><<< 10896 1726882181.92146: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882181.92152: handler run complete 10896 1726882181.92171: Evaluated conditional (False): False 10896 1726882181.92180: attempt loop complete, returning result 10896 1726882181.92183: _execute() done 10896 1726882181.92185: dumping result to json 10896 1726882181.92190: done dumping result, returning 10896 1726882181.92203: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [12673a56-9f93-8b02-b216-000000000446] 10896 1726882181.92208: sending task result for task 12673a56-9f93-8b02-b216-000000000446 10896 1726882181.92301: done sending task result for task 12673a56-9f93-8b02-b216-000000000446 10896 1726882181.92304: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.020301", "end": "2024-09-20 21:29:41.876181", "rc": 0, "start": "2024-09-20 21:29:41.855880" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 10896 1726882181.92367: no more pending results, returning what we have 10896 1726882181.92370: results queue empty 10896 1726882181.92371: checking for any_errors_fatal 10896 1726882181.92377: done checking for any_errors_fatal 10896 1726882181.92378: checking for max_fail_percentage 10896 1726882181.92379: done checking for max_fail_percentage 10896 1726882181.92380: checking to see if all hosts have failed and the running result is not ok 10896 1726882181.92381: done checking to see if all hosts have failed 10896 1726882181.92382: getting the remaining hosts for this loop 10896 1726882181.92383: done getting the remaining hosts for this loop 10896 1726882181.92386: getting the next task for host managed_node2 10896 1726882181.92396: done getting next task for host managed_node2 10896 1726882181.92404: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10896 1726882181.92409: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882181.92413: getting variables 10896 1726882181.92414: in VariableManager get_vars() 10896 1726882181.92458: Calling all_inventory to load vars for managed_node2 10896 1726882181.92460: Calling groups_inventory to load vars for managed_node2 10896 1726882181.92462: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882181.92472: Calling all_plugins_play to load vars for managed_node2 10896 1726882181.92475: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882181.92477: Calling groups_plugins_play to load vars for managed_node2 10896 1726882181.93851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882181.94701: done with get_vars() 10896 1726882181.94717: done getting variables 10896 1726882181.94760: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:29:41 -0400 (0:00:00.399) 0:00:23.514 ****** 10896 1726882181.94784: entering _queue_task() for managed_node2/set_fact 10896 1726882181.95030: worker is 1 (out of 1 available) 10896 1726882181.95044: exiting _queue_task() for managed_node2/set_fact 10896 1726882181.95057: done queuing things up, now waiting for results queue to drain 10896 1726882181.95059: waiting for pending results... 
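The "Get NM profile info" task ran the pipeline nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc through the shell action and, judging by the next task's conditional, registered the result as nm_profile_exists. A sketch consistent with the command and result shown above; the hard-coded bond0.1 is the rendered value (the task presumably templates the profile name), and the changed_when override is an assumption suggested by the "Evaluated conditional (False): False" line and the changed: false in the reported result:

- name: Get NM profile info
  ansible.builtin.shell: nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc
  register: nm_profile_exists   # consumed below via nm_profile_exists.rc == 0
  changed_when: false           # assumed; matches the changed=false reported for this task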
10896 1726882181.95228: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10896 1726882181.95304: in run() - task 12673a56-9f93-8b02-b216-000000000447 10896 1726882181.95316: variable 'ansible_search_path' from source: unknown 10896 1726882181.95320: variable 'ansible_search_path' from source: unknown 10896 1726882181.95347: calling self._execute() 10896 1726882181.95429: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882181.95433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882181.95441: variable 'omit' from source: magic vars 10896 1726882181.95728: variable 'ansible_distribution_major_version' from source: facts 10896 1726882181.95762: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882181.95908: variable 'nm_profile_exists' from source: set_fact 10896 1726882181.95912: Evaluated conditional (nm_profile_exists.rc == 0): True 10896 1726882181.95914: variable 'omit' from source: magic vars 10896 1726882181.95938: variable 'omit' from source: magic vars 10896 1726882181.95968: variable 'omit' from source: magic vars 10896 1726882181.96011: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882181.96261: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882181.96264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882181.96267: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882181.96270: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882181.96272: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882181.96275: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882181.96277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882181.96279: Set connection var ansible_connection to ssh 10896 1726882181.96282: Set connection var ansible_timeout to 10 10896 1726882181.96285: Set connection var ansible_shell_type to sh 10896 1726882181.96287: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882181.96290: Set connection var ansible_shell_executable to /bin/sh 10896 1726882181.96292: Set connection var ansible_pipelining to False 10896 1726882181.96297: variable 'ansible_shell_executable' from source: unknown 10896 1726882181.96299: variable 'ansible_connection' from source: unknown 10896 1726882181.96302: variable 'ansible_module_compression' from source: unknown 10896 1726882181.96304: variable 'ansible_shell_type' from source: unknown 10896 1726882181.96306: variable 'ansible_shell_executable' from source: unknown 10896 1726882181.96308: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882181.96310: variable 'ansible_pipelining' from source: unknown 10896 1726882181.96311: variable 'ansible_timeout' from source: unknown 10896 1726882181.96314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882181.96463: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882181.96467: variable 'omit' from source: magic vars 10896 1726882181.96469: starting attempt loop 10896 1726882181.96471: running the handler 10896 1726882181.96474: handler run complete 10896 1726882181.96476: attempt loop complete, returning result 10896 1726882181.96478: _execute() done 10896 1726882181.96480: dumping result to json 10896 1726882181.96483: done dumping result, returning 10896 1726882181.96501: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-8b02-b216-000000000447] 10896 1726882181.96504: sending task result for task 12673a56-9f93-8b02-b216-000000000447 ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 10896 1726882181.96646: no more pending results, returning what we have 10896 1726882181.96649: results queue empty 10896 1726882181.96651: checking for any_errors_fatal 10896 1726882181.96661: done checking for any_errors_fatal 10896 1726882181.96662: checking for max_fail_percentage 10896 1726882181.96664: done checking for max_fail_percentage 10896 1726882181.96665: checking to see if all hosts have failed and the running result is not ok 10896 1726882181.96666: done checking to see if all hosts have failed 10896 1726882181.96667: getting the remaining hosts for this loop 10896 1726882181.96668: done getting the remaining hosts for this loop 10896 1726882181.96672: getting the next task for host managed_node2 10896 1726882181.96682: done getting next task for host managed_node2 10896 1726882181.96685: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 10896 1726882181.96690: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882181.96697: getting variables 10896 1726882181.96699: in VariableManager get_vars() 10896 1726882181.96745: Calling all_inventory to load vars for managed_node2 10896 1726882181.96749: Calling groups_inventory to load vars for managed_node2 10896 1726882181.96752: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882181.96763: Calling all_plugins_play to load vars for managed_node2 10896 1726882181.96766: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882181.96768: Calling groups_plugins_play to load vars for managed_node2 10896 1726882181.97311: done sending task result for task 12673a56-9f93-8b02-b216-000000000447 10896 1726882181.97314: WORKER PROCESS EXITING 10896 1726882181.97855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882181.98787: done with get_vars() 10896 1726882181.98805: done getting variables 10896 1726882181.98846: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882181.98930: variable 'profile' from source: include params 10896 1726882181.98933: variable 'item' from source: include params 10896 1726882181.98974: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:29:41 -0400 (0:00:00.042) 0:00:23.556 ****** 10896 1726882181.99003: entering _queue_task() for managed_node2/command 10896 1726882181.99288: worker is 1 (out of 1 available) 10896 1726882181.99303: exiting _queue_task() for managed_node2/command 10896 1726882181.99315: done queuing things up, now waiting for results queue to drain 10896 1726882181.99316: waiting for pending results... 
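The task queued above is defined at get_profile_stat.yml:49. From the trace that follows, it resolves to the command action and is guarded by two conditionals: ansible_distribution_major_version != '6' (True) and profile_stat.stat.exists (False, so the task is skipped). A minimal YAML sketch of that shape; the command line and the register name are not visible in this excerpt and are placeholders:

    # Hypothetical reconstruction of the task at get_profile_stat.yml:49.
    # Only the task name, the command action, and the two when conditions are
    # taken from the trace; the command string and register are placeholders.
    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      command: cat /path/to/ifcfg-{{ profile }}    # placeholder; the real command is not shown in this excerpt
      register: ifcfg_ansible_managed              # placeholder name
      when:
        - ansible_distribution_major_version != '6'   # evaluated True in the trace
        - profile_stat.stat.exists                    # evaluated False here, so the task is skipped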
10896 1726882181.99532: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.1 10896 1726882181.99648: in run() - task 12673a56-9f93-8b02-b216-000000000449 10896 1726882181.99658: variable 'ansible_search_path' from source: unknown 10896 1726882181.99662: variable 'ansible_search_path' from source: unknown 10896 1726882181.99699: calling self._execute() 10896 1726882181.99798: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882181.99807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882181.99874: variable 'omit' from source: magic vars 10896 1726882182.00176: variable 'ansible_distribution_major_version' from source: facts 10896 1726882182.00186: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882182.00500: variable 'profile_stat' from source: set_fact 10896 1726882182.00504: Evaluated conditional (profile_stat.stat.exists): False 10896 1726882182.00507: when evaluation is False, skipping this task 10896 1726882182.00509: _execute() done 10896 1726882182.00511: dumping result to json 10896 1726882182.00513: done dumping result, returning 10896 1726882182.00515: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [12673a56-9f93-8b02-b216-000000000449] 10896 1726882182.00517: sending task result for task 12673a56-9f93-8b02-b216-000000000449 10896 1726882182.00576: done sending task result for task 12673a56-9f93-8b02-b216-000000000449 10896 1726882182.00580: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10896 1726882182.00631: no more pending results, returning what we have 10896 1726882182.00634: results queue empty 10896 1726882182.00635: checking for any_errors_fatal 10896 1726882182.00641: done checking for any_errors_fatal 10896 1726882182.00641: checking for max_fail_percentage 10896 1726882182.00643: done checking for max_fail_percentage 10896 1726882182.00644: checking to see if all hosts have failed and the running result is not ok 10896 1726882182.00645: done checking to see if all hosts have failed 10896 1726882182.00645: getting the remaining hosts for this loop 10896 1726882182.00647: done getting the remaining hosts for this loop 10896 1726882182.00651: getting the next task for host managed_node2 10896 1726882182.00658: done getting next task for host managed_node2 10896 1726882182.00661: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 10896 1726882182.00665: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882182.00669: getting variables 10896 1726882182.00670: in VariableManager get_vars() 10896 1726882182.00714: Calling all_inventory to load vars for managed_node2 10896 1726882182.00717: Calling groups_inventory to load vars for managed_node2 10896 1726882182.00719: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882182.00731: Calling all_plugins_play to load vars for managed_node2 10896 1726882182.00734: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882182.00736: Calling groups_plugins_play to load vars for managed_node2 10896 1726882182.01769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882182.02617: done with get_vars() 10896 1726882182.02632: done getting variables 10896 1726882182.02677: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882182.02753: variable 'profile' from source: include params 10896 1726882182.02756: variable 'item' from source: include params 10896 1726882182.02799: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:29:42 -0400 (0:00:00.038) 0:00:23.595 ****** 10896 1726882182.02821: entering _queue_task() for managed_node2/set_fact 10896 1726882182.03049: worker is 1 (out of 1 available) 10896 1726882182.03062: exiting _queue_task() for managed_node2/set_fact 10896 1726882182.03074: done queuing things up, now waiting for results queue to drain 10896 1726882182.03075: waiting for pending results... 
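The task queued here, at get_profile_stat.yml:56, follows the same pattern but resolves to set_fact (see the "Loading ActionModule 'set_fact'" entry above) and is skipped for the same reason. Its body is not shown anywhere in this excerpt, so the fact below is purely illustrative; only the task name, the action, and the guarding conditionals come from the trace:

    # Hypothetical shape of the task at get_profile_stat.yml:56 (set_fact, skipped here).
    - name: Verify the ansible_managed comment in ifcfg-{{ profile }}
      set_fact:
        lsr_net_profile_ansible_managed: true   # illustrative; the real fact and value are not in this excerpt
      when:
        - ansible_distribution_major_version != '6'
        - profile_stat.stat.exists   # False here, hence the skip that follows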
10896 1726882182.03526: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 10896 1726882182.03532: in run() - task 12673a56-9f93-8b02-b216-00000000044a 10896 1726882182.03536: variable 'ansible_search_path' from source: unknown 10896 1726882182.03539: variable 'ansible_search_path' from source: unknown 10896 1726882182.03542: calling self._execute() 10896 1726882182.03595: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.03604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.03614: variable 'omit' from source: magic vars 10896 1726882182.03990: variable 'ansible_distribution_major_version' from source: facts 10896 1726882182.04007: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882182.04133: variable 'profile_stat' from source: set_fact 10896 1726882182.04148: Evaluated conditional (profile_stat.stat.exists): False 10896 1726882182.04152: when evaluation is False, skipping this task 10896 1726882182.04154: _execute() done 10896 1726882182.04157: dumping result to json 10896 1726882182.04173: done dumping result, returning 10896 1726882182.04176: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [12673a56-9f93-8b02-b216-00000000044a] 10896 1726882182.04206: sending task result for task 12673a56-9f93-8b02-b216-00000000044a 10896 1726882182.04296: done sending task result for task 12673a56-9f93-8b02-b216-00000000044a 10896 1726882182.04300: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10896 1726882182.04347: no more pending results, returning what we have 10896 1726882182.04351: results queue empty 10896 1726882182.04352: checking for any_errors_fatal 10896 1726882182.04357: done checking for any_errors_fatal 10896 1726882182.04358: checking for max_fail_percentage 10896 1726882182.04359: done checking for max_fail_percentage 10896 1726882182.04360: checking to see if all hosts have failed and the running result is not ok 10896 1726882182.04361: done checking to see if all hosts have failed 10896 1726882182.04362: getting the remaining hosts for this loop 10896 1726882182.04363: done getting the remaining hosts for this loop 10896 1726882182.04367: getting the next task for host managed_node2 10896 1726882182.04374: done getting next task for host managed_node2 10896 1726882182.04376: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 10896 1726882182.04387: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882182.04391: getting variables 10896 1726882182.04395: in VariableManager get_vars() 10896 1726882182.04432: Calling all_inventory to load vars for managed_node2 10896 1726882182.04435: Calling groups_inventory to load vars for managed_node2 10896 1726882182.04437: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882182.04445: Calling all_plugins_play to load vars for managed_node2 10896 1726882182.04447: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882182.04450: Calling groups_plugins_play to load vars for managed_node2 10896 1726882182.05302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882182.09102: done with get_vars() 10896 1726882182.09119: done getting variables 10896 1726882182.09153: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882182.09220: variable 'profile' from source: include params 10896 1726882182.09223: variable 'item' from source: include params 10896 1726882182.09259: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:29:42 -0400 (0:00:00.064) 0:00:23.659 ****** 10896 1726882182.09279: entering _queue_task() for managed_node2/command 10896 1726882182.09529: worker is 1 (out of 1 available) 10896 1726882182.09542: exiting _queue_task() for managed_node2/command 10896 1726882182.09556: done queuing things up, now waiting for results queue to drain 10896 1726882182.09557: waiting for pending results... 
10896 1726882182.09733: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.1 10896 1726882182.09816: in run() - task 12673a56-9f93-8b02-b216-00000000044b 10896 1726882182.09827: variable 'ansible_search_path' from source: unknown 10896 1726882182.09831: variable 'ansible_search_path' from source: unknown 10896 1726882182.09858: calling self._execute() 10896 1726882182.09934: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.09941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.09949: variable 'omit' from source: magic vars 10896 1726882182.10228: variable 'ansible_distribution_major_version' from source: facts 10896 1726882182.10237: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882182.10319: variable 'profile_stat' from source: set_fact 10896 1726882182.10335: Evaluated conditional (profile_stat.stat.exists): False 10896 1726882182.10338: when evaluation is False, skipping this task 10896 1726882182.10341: _execute() done 10896 1726882182.10344: dumping result to json 10896 1726882182.10347: done dumping result, returning 10896 1726882182.10350: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.1 [12673a56-9f93-8b02-b216-00000000044b] 10896 1726882182.10352: sending task result for task 12673a56-9f93-8b02-b216-00000000044b 10896 1726882182.10433: done sending task result for task 12673a56-9f93-8b02-b216-00000000044b 10896 1726882182.10437: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10896 1726882182.10484: no more pending results, returning what we have 10896 1726882182.10487: results queue empty 10896 1726882182.10488: checking for any_errors_fatal 10896 1726882182.10497: done checking for any_errors_fatal 10896 1726882182.10498: checking for max_fail_percentage 10896 1726882182.10500: done checking for max_fail_percentage 10896 1726882182.10501: checking to see if all hosts have failed and the running result is not ok 10896 1726882182.10501: done checking to see if all hosts have failed 10896 1726882182.10502: getting the remaining hosts for this loop 10896 1726882182.10503: done getting the remaining hosts for this loop 10896 1726882182.10506: getting the next task for host managed_node2 10896 1726882182.10512: done getting next task for host managed_node2 10896 1726882182.10514: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 10896 1726882182.10518: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882182.10522: getting variables 10896 1726882182.10523: in VariableManager get_vars() 10896 1726882182.10565: Calling all_inventory to load vars for managed_node2 10896 1726882182.10568: Calling groups_inventory to load vars for managed_node2 10896 1726882182.10570: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882182.10579: Calling all_plugins_play to load vars for managed_node2 10896 1726882182.10581: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882182.10583: Calling groups_plugins_play to load vars for managed_node2 10896 1726882182.11321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882182.12177: done with get_vars() 10896 1726882182.12191: done getting variables 10896 1726882182.12235: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882182.12311: variable 'profile' from source: include params 10896 1726882182.12314: variable 'item' from source: include params 10896 1726882182.12352: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:29:42 -0400 (0:00:00.030) 0:00:23.690 ****** 10896 1726882182.12373: entering _queue_task() for managed_node2/set_fact 10896 1726882182.12588: worker is 1 (out of 1 available) 10896 1726882182.12603: exiting _queue_task() for managed_node2/set_fact 10896 1726882182.12617: done queuing things up, now waiting for results queue to drain 10896 1726882182.12618: waiting for pending results... 
10896 1726882182.12780: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.1 10896 1726882182.12863: in run() - task 12673a56-9f93-8b02-b216-00000000044c 10896 1726882182.12874: variable 'ansible_search_path' from source: unknown 10896 1726882182.12877: variable 'ansible_search_path' from source: unknown 10896 1726882182.12909: calling self._execute() 10896 1726882182.12984: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.12987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.12999: variable 'omit' from source: magic vars 10896 1726882182.13256: variable 'ansible_distribution_major_version' from source: facts 10896 1726882182.13265: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882182.13349: variable 'profile_stat' from source: set_fact 10896 1726882182.13359: Evaluated conditional (profile_stat.stat.exists): False 10896 1726882182.13362: when evaluation is False, skipping this task 10896 1726882182.13366: _execute() done 10896 1726882182.13369: dumping result to json 10896 1726882182.13371: done dumping result, returning 10896 1726882182.13375: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [12673a56-9f93-8b02-b216-00000000044c] 10896 1726882182.13380: sending task result for task 12673a56-9f93-8b02-b216-00000000044c 10896 1726882182.13466: done sending task result for task 12673a56-9f93-8b02-b216-00000000044c 10896 1726882182.13469: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10896 1726882182.13535: no more pending results, returning what we have 10896 1726882182.13538: results queue empty 10896 1726882182.13538: checking for any_errors_fatal 10896 1726882182.13541: done checking for any_errors_fatal 10896 1726882182.13542: checking for max_fail_percentage 10896 1726882182.13543: done checking for max_fail_percentage 10896 1726882182.13544: checking to see if all hosts have failed and the running result is not ok 10896 1726882182.13545: done checking to see if all hosts have failed 10896 1726882182.13546: getting the remaining hosts for this loop 10896 1726882182.13547: done getting the remaining hosts for this loop 10896 1726882182.13550: getting the next task for host managed_node2 10896 1726882182.13557: done getting next task for host managed_node2 10896 1726882182.13560: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 10896 1726882182.13562: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882182.13566: getting variables 10896 1726882182.13567: in VariableManager get_vars() 10896 1726882182.13600: Calling all_inventory to load vars for managed_node2 10896 1726882182.13602: Calling groups_inventory to load vars for managed_node2 10896 1726882182.13604: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882182.13613: Calling all_plugins_play to load vars for managed_node2 10896 1726882182.13615: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882182.13618: Calling groups_plugins_play to load vars for managed_node2 10896 1726882182.14453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882182.15303: done with get_vars() 10896 1726882182.15318: done getting variables 10896 1726882182.15359: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882182.15442: variable 'profile' from source: include params 10896 1726882182.15445: variable 'item' from source: include params 10896 1726882182.15482: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:29:42 -0400 (0:00:00.031) 0:00:23.722 ****** 10896 1726882182.15508: entering _queue_task() for managed_node2/assert 10896 1726882182.15742: worker is 1 (out of 1 available) 10896 1726882182.15757: exiting _queue_task() for managed_node2/assert 10896 1726882182.15770: done queuing things up, now waiting for results queue to drain 10896 1726882182.15771: waiting for pending results... 
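The assert queued above is defined at assert_profile_present.yml:5, and the trace that follows (together with the two sibling tasks at lines 10 and 15 of the same file) evaluates the flags set earlier from the nmcli output: lsr_net_profile_exists, lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint, each True, each ending in "All assertions passed". A minimal sketch of what that include file likely contains; the task names, the assert action and the tested variables come from the trace, while the failure messages are assumptions:

    # Hypothetical reconstruction of assert_profile_present.yml, based on the
    # task names, line numbers and conditionals that appear in this trace.
    - name: Assert that the profile is present - '{{ profile }}'
      assert:
        that:
          - lsr_net_profile_exists
        msg: "Profile {{ profile }} is not present"                  # assumed message

    - name: Assert that the ansible managed comment is present in '{{ profile }}'
      assert:
        that:
          - lsr_net_profile_ansible_managed
        msg: "ansible_managed comment is missing in {{ profile }}"   # assumed message

    - name: Assert that the fingerprint comment is present in {{ profile }}
      assert:
        that:
          - lsr_net_profile_fingerprint
        msg: "fingerprint comment is missing in {{ profile }}"       # assumed message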
10896 1726882182.15943: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.1' 10896 1726882182.16009: in run() - task 12673a56-9f93-8b02-b216-00000000026f 10896 1726882182.16022: variable 'ansible_search_path' from source: unknown 10896 1726882182.16026: variable 'ansible_search_path' from source: unknown 10896 1726882182.16054: calling self._execute() 10896 1726882182.16135: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.16140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.16148: variable 'omit' from source: magic vars 10896 1726882182.16415: variable 'ansible_distribution_major_version' from source: facts 10896 1726882182.16424: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882182.16431: variable 'omit' from source: magic vars 10896 1726882182.16457: variable 'omit' from source: magic vars 10896 1726882182.16527: variable 'profile' from source: include params 10896 1726882182.16531: variable 'item' from source: include params 10896 1726882182.16576: variable 'item' from source: include params 10896 1726882182.16590: variable 'omit' from source: magic vars 10896 1726882182.16625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882182.16652: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882182.16668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882182.16681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882182.16691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882182.16719: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882182.16723: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.16725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.16791: Set connection var ansible_connection to ssh 10896 1726882182.16800: Set connection var ansible_timeout to 10 10896 1726882182.16803: Set connection var ansible_shell_type to sh 10896 1726882182.16810: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882182.16815: Set connection var ansible_shell_executable to /bin/sh 10896 1726882182.16821: Set connection var ansible_pipelining to False 10896 1726882182.16838: variable 'ansible_shell_executable' from source: unknown 10896 1726882182.16841: variable 'ansible_connection' from source: unknown 10896 1726882182.16844: variable 'ansible_module_compression' from source: unknown 10896 1726882182.16847: variable 'ansible_shell_type' from source: unknown 10896 1726882182.16849: variable 'ansible_shell_executable' from source: unknown 10896 1726882182.16851: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.16853: variable 'ansible_pipelining' from source: unknown 10896 1726882182.16856: variable 'ansible_timeout' from source: unknown 10896 1726882182.16861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.16961: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882182.16971: variable 'omit' from source: magic vars 10896 1726882182.16976: starting attempt loop 10896 1726882182.16981: running the handler 10896 1726882182.17054: variable 'lsr_net_profile_exists' from source: set_fact 10896 1726882182.17058: Evaluated conditional (lsr_net_profile_exists): True 10896 1726882182.17064: handler run complete 10896 1726882182.17075: attempt loop complete, returning result 10896 1726882182.17078: _execute() done 10896 1726882182.17081: dumping result to json 10896 1726882182.17084: done dumping result, returning 10896 1726882182.17091: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.1' [12673a56-9f93-8b02-b216-00000000026f] 10896 1726882182.17096: sending task result for task 12673a56-9f93-8b02-b216-00000000026f 10896 1726882182.17177: done sending task result for task 12673a56-9f93-8b02-b216-00000000026f 10896 1726882182.17180: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 10896 1726882182.17248: no more pending results, returning what we have 10896 1726882182.17251: results queue empty 10896 1726882182.17252: checking for any_errors_fatal 10896 1726882182.17257: done checking for any_errors_fatal 10896 1726882182.17258: checking for max_fail_percentage 10896 1726882182.17260: done checking for max_fail_percentage 10896 1726882182.17261: checking to see if all hosts have failed and the running result is not ok 10896 1726882182.17261: done checking to see if all hosts have failed 10896 1726882182.17262: getting the remaining hosts for this loop 10896 1726882182.17263: done getting the remaining hosts for this loop 10896 1726882182.17267: getting the next task for host managed_node2 10896 1726882182.17272: done getting next task for host managed_node2 10896 1726882182.17274: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 10896 1726882182.17277: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882182.17281: getting variables 10896 1726882182.17282: in VariableManager get_vars() 10896 1726882182.17322: Calling all_inventory to load vars for managed_node2 10896 1726882182.17324: Calling groups_inventory to load vars for managed_node2 10896 1726882182.17326: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882182.17335: Calling all_plugins_play to load vars for managed_node2 10896 1726882182.17337: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882182.17340: Calling groups_plugins_play to load vars for managed_node2 10896 1726882182.18085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882182.18957: done with get_vars() 10896 1726882182.18972: done getting variables 10896 1726882182.19016: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882182.19101: variable 'profile' from source: include params 10896 1726882182.19104: variable 'item' from source: include params 10896 1726882182.19147: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:29:42 -0400 (0:00:00.036) 0:00:23.758 ****** 10896 1726882182.19174: entering _queue_task() for managed_node2/assert 10896 1726882182.19418: worker is 1 (out of 1 available) 10896 1726882182.19431: exiting _queue_task() for managed_node2/assert 10896 1726882182.19444: done queuing things up, now waiting for results queue to drain 10896 1726882182.19445: waiting for pending results... 
10896 1726882182.19624: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.1' 10896 1726882182.19695: in run() - task 12673a56-9f93-8b02-b216-000000000270 10896 1726882182.19710: variable 'ansible_search_path' from source: unknown 10896 1726882182.19714: variable 'ansible_search_path' from source: unknown 10896 1726882182.19742: calling self._execute() 10896 1726882182.19824: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.19828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.19836: variable 'omit' from source: magic vars 10896 1726882182.20107: variable 'ansible_distribution_major_version' from source: facts 10896 1726882182.20119: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882182.20122: variable 'omit' from source: magic vars 10896 1726882182.20155: variable 'omit' from source: magic vars 10896 1726882182.20227: variable 'profile' from source: include params 10896 1726882182.20230: variable 'item' from source: include params 10896 1726882182.20274: variable 'item' from source: include params 10896 1726882182.20288: variable 'omit' from source: magic vars 10896 1726882182.20324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882182.20352: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882182.20369: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882182.20382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882182.20394: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882182.20421: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882182.20424: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.20427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.20498: Set connection var ansible_connection to ssh 10896 1726882182.20506: Set connection var ansible_timeout to 10 10896 1726882182.20509: Set connection var ansible_shell_type to sh 10896 1726882182.20515: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882182.20520: Set connection var ansible_shell_executable to /bin/sh 10896 1726882182.20525: Set connection var ansible_pipelining to False 10896 1726882182.20543: variable 'ansible_shell_executable' from source: unknown 10896 1726882182.20547: variable 'ansible_connection' from source: unknown 10896 1726882182.20550: variable 'ansible_module_compression' from source: unknown 10896 1726882182.20553: variable 'ansible_shell_type' from source: unknown 10896 1726882182.20557: variable 'ansible_shell_executable' from source: unknown 10896 1726882182.20559: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.20561: variable 'ansible_pipelining' from source: unknown 10896 1726882182.20564: variable 'ansible_timeout' from source: unknown 10896 1726882182.20566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.20664: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882182.20673: variable 'omit' from source: magic vars 10896 1726882182.20685: starting attempt loop 10896 1726882182.20688: running the handler 10896 1726882182.20757: variable 'lsr_net_profile_ansible_managed' from source: set_fact 10896 1726882182.20760: Evaluated conditional (lsr_net_profile_ansible_managed): True 10896 1726882182.20766: handler run complete 10896 1726882182.20777: attempt loop complete, returning result 10896 1726882182.20780: _execute() done 10896 1726882182.20782: dumping result to json 10896 1726882182.20786: done dumping result, returning 10896 1726882182.20796: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.1' [12673a56-9f93-8b02-b216-000000000270] 10896 1726882182.20801: sending task result for task 12673a56-9f93-8b02-b216-000000000270 10896 1726882182.20879: done sending task result for task 12673a56-9f93-8b02-b216-000000000270 10896 1726882182.20882: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 10896 1726882182.20948: no more pending results, returning what we have 10896 1726882182.20951: results queue empty 10896 1726882182.20951: checking for any_errors_fatal 10896 1726882182.20960: done checking for any_errors_fatal 10896 1726882182.20961: checking for max_fail_percentage 10896 1726882182.20962: done checking for max_fail_percentage 10896 1726882182.20963: checking to see if all hosts have failed and the running result is not ok 10896 1726882182.20964: done checking to see if all hosts have failed 10896 1726882182.20965: getting the remaining hosts for this loop 10896 1726882182.20966: done getting the remaining hosts for this loop 10896 1726882182.20969: getting the next task for host managed_node2 10896 1726882182.20976: done getting next task for host managed_node2 10896 1726882182.20978: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 10896 1726882182.20981: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882182.20985: getting variables 10896 1726882182.20986: in VariableManager get_vars() 10896 1726882182.21029: Calling all_inventory to load vars for managed_node2 10896 1726882182.21032: Calling groups_inventory to load vars for managed_node2 10896 1726882182.21035: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882182.21043: Calling all_plugins_play to load vars for managed_node2 10896 1726882182.21046: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882182.21048: Calling groups_plugins_play to load vars for managed_node2 10896 1726882182.21921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882182.22765: done with get_vars() 10896 1726882182.22780: done getting variables 10896 1726882182.22825: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882182.22907: variable 'profile' from source: include params 10896 1726882182.22910: variable 'item' from source: include params 10896 1726882182.22951: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:29:42 -0400 (0:00:00.038) 0:00:23.796 ****** 10896 1726882182.22978: entering _queue_task() for managed_node2/assert 10896 1726882182.23219: worker is 1 (out of 1 available) 10896 1726882182.23234: exiting _queue_task() for managed_node2/assert 10896 1726882182.23246: done queuing things up, now waiting for results queue to drain 10896 1726882182.23247: waiting for pending results... 
10896 1726882182.23420: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.1 10896 1726882182.23489: in run() - task 12673a56-9f93-8b02-b216-000000000271 10896 1726882182.23505: variable 'ansible_search_path' from source: unknown 10896 1726882182.23509: variable 'ansible_search_path' from source: unknown 10896 1726882182.23537: calling self._execute() 10896 1726882182.23623: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.23627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.23637: variable 'omit' from source: magic vars 10896 1726882182.23905: variable 'ansible_distribution_major_version' from source: facts 10896 1726882182.23920: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882182.23924: variable 'omit' from source: magic vars 10896 1726882182.23951: variable 'omit' from source: magic vars 10896 1726882182.24026: variable 'profile' from source: include params 10896 1726882182.24029: variable 'item' from source: include params 10896 1726882182.24072: variable 'item' from source: include params 10896 1726882182.24086: variable 'omit' from source: magic vars 10896 1726882182.24121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882182.24149: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882182.24164: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882182.24178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882182.24186: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882182.24213: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882182.24217: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.24219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.24288: Set connection var ansible_connection to ssh 10896 1726882182.24295: Set connection var ansible_timeout to 10 10896 1726882182.24300: Set connection var ansible_shell_type to sh 10896 1726882182.24308: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882182.24312: Set connection var ansible_shell_executable to /bin/sh 10896 1726882182.24317: Set connection var ansible_pipelining to False 10896 1726882182.24336: variable 'ansible_shell_executable' from source: unknown 10896 1726882182.24339: variable 'ansible_connection' from source: unknown 10896 1726882182.24343: variable 'ansible_module_compression' from source: unknown 10896 1726882182.24345: variable 'ansible_shell_type' from source: unknown 10896 1726882182.24348: variable 'ansible_shell_executable' from source: unknown 10896 1726882182.24350: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.24352: variable 'ansible_pipelining' from source: unknown 10896 1726882182.24356: variable 'ansible_timeout' from source: unknown 10896 1726882182.24359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.24458: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882182.24475: variable 'omit' from source: magic vars 10896 1726882182.24481: starting attempt loop 10896 1726882182.24484: running the handler 10896 1726882182.24550: variable 'lsr_net_profile_fingerprint' from source: set_fact 10896 1726882182.24553: Evaluated conditional (lsr_net_profile_fingerprint): True 10896 1726882182.24559: handler run complete 10896 1726882182.24569: attempt loop complete, returning result 10896 1726882182.24572: _execute() done 10896 1726882182.24574: dumping result to json 10896 1726882182.24579: done dumping result, returning 10896 1726882182.24588: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.1 [12673a56-9f93-8b02-b216-000000000271] 10896 1726882182.24591: sending task result for task 12673a56-9f93-8b02-b216-000000000271 10896 1726882182.24669: done sending task result for task 12673a56-9f93-8b02-b216-000000000271 10896 1726882182.24671: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 10896 1726882182.24742: no more pending results, returning what we have 10896 1726882182.24745: results queue empty 10896 1726882182.24746: checking for any_errors_fatal 10896 1726882182.24751: done checking for any_errors_fatal 10896 1726882182.24751: checking for max_fail_percentage 10896 1726882182.24753: done checking for max_fail_percentage 10896 1726882182.24754: checking to see if all hosts have failed and the running result is not ok 10896 1726882182.24755: done checking to see if all hosts have failed 10896 1726882182.24755: getting the remaining hosts for this loop 10896 1726882182.24757: done getting the remaining hosts for this loop 10896 1726882182.24759: getting the next task for host managed_node2 10896 1726882182.24766: done getting next task for host managed_node2 10896 1726882182.24768: ^ task is: TASK: ** TEST check polling interval 10896 1726882182.24770: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882182.24774: getting variables 10896 1726882182.24776: in VariableManager get_vars() 10896 1726882182.24818: Calling all_inventory to load vars for managed_node2 10896 1726882182.24821: Calling groups_inventory to load vars for managed_node2 10896 1726882182.24824: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882182.24832: Calling all_plugins_play to load vars for managed_node2 10896 1726882182.24835: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882182.24837: Calling groups_plugins_play to load vars for managed_node2 10896 1726882182.25584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882182.26542: done with get_vars() 10896 1726882182.26556: done getting variables 10896 1726882182.26596: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:75 Friday 20 September 2024 21:29:42 -0400 (0:00:00.036) 0:00:23.833 ****** 10896 1726882182.26616: entering _queue_task() for managed_node2/command 10896 1726882182.26838: worker is 1 (out of 1 available) 10896 1726882182.26850: exiting _queue_task() for managed_node2/command 10896 1726882182.26863: done queuing things up, now waiting for results queue to drain 10896 1726882182.26865: waiting for pending results... 10896 1726882182.27034: running TaskExecutor() for managed_node2/TASK: ** TEST check polling interval 10896 1726882182.27087: in run() - task 12673a56-9f93-8b02-b216-000000000071 10896 1726882182.27104: variable 'ansible_search_path' from source: unknown 10896 1726882182.27131: calling self._execute() 10896 1726882182.27212: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.27217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.27224: variable 'omit' from source: magic vars 10896 1726882182.27497: variable 'ansible_distribution_major_version' from source: facts 10896 1726882182.27508: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882182.27513: variable 'omit' from source: magic vars 10896 1726882182.27531: variable 'omit' from source: magic vars 10896 1726882182.27598: variable 'controller_device' from source: play vars 10896 1726882182.27613: variable 'omit' from source: magic vars 10896 1726882182.27647: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882182.27672: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882182.27687: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882182.27705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882182.27715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882182.27738: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 10896 1726882182.27741: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.27744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.27815: Set connection var ansible_connection to ssh 10896 1726882182.27818: Set connection var ansible_timeout to 10 10896 1726882182.27821: Set connection var ansible_shell_type to sh 10896 1726882182.27828: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882182.27833: Set connection var ansible_shell_executable to /bin/sh 10896 1726882182.27838: Set connection var ansible_pipelining to False 10896 1726882182.27861: variable 'ansible_shell_executable' from source: unknown 10896 1726882182.27864: variable 'ansible_connection' from source: unknown 10896 1726882182.27867: variable 'ansible_module_compression' from source: unknown 10896 1726882182.27869: variable 'ansible_shell_type' from source: unknown 10896 1726882182.27872: variable 'ansible_shell_executable' from source: unknown 10896 1726882182.27874: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.27876: variable 'ansible_pipelining' from source: unknown 10896 1726882182.27879: variable 'ansible_timeout' from source: unknown 10896 1726882182.27881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.27976: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882182.27987: variable 'omit' from source: magic vars 10896 1726882182.27995: starting attempt loop 10896 1726882182.27999: running the handler 10896 1726882182.28013: _low_level_execute_command(): starting 10896 1726882182.28020: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882182.28550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882182.28554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.28558: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 10896 1726882182.28561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.28600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882182.28604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882182.28611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882182.28683: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 10896 1726882182.30381: stdout chunk (state=3): >>>/root <<< 10896 1726882182.30485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882182.30517: stderr chunk (state=3): >>><<< 10896 1726882182.30520: stdout chunk (state=3): >>><<< 10896 1726882182.30543: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882182.30555: _low_level_execute_command(): starting 10896 1726882182.30562: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882182.305429-12145-58927421903278 `" && echo ansible-tmp-1726882182.305429-12145-58927421903278="` echo /root/.ansible/tmp/ansible-tmp-1726882182.305429-12145-58927421903278 `" ) && sleep 0' 10896 1726882182.31000: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882182.31004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882182.31029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882182.31032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.31043: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882182.31054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.31105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882182.31109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882182.31123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 10896 1726882182.31179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882182.33050: stdout chunk (state=3): >>>ansible-tmp-1726882182.305429-12145-58927421903278=/root/.ansible/tmp/ansible-tmp-1726882182.305429-12145-58927421903278 <<< 10896 1726882182.33153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882182.33178: stderr chunk (state=3): >>><<< 10896 1726882182.33181: stdout chunk (state=3): >>><<< 10896 1726882182.33200: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882182.305429-12145-58927421903278=/root/.ansible/tmp/ansible-tmp-1726882182.305429-12145-58927421903278 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882182.33231: variable 'ansible_module_compression' from source: unknown 10896 1726882182.33271: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10896 1726882182.33304: variable 'ansible_facts' from source: unknown 10896 1726882182.33361: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882182.305429-12145-58927421903278/AnsiballZ_command.py 10896 1726882182.33463: Sending initial data 10896 1726882182.33466: Sent initial data (154 bytes) 10896 1726882182.33884: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882182.33922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882182.33925: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882182.33927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.33930: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882182.33932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 
originally 10.31.14.69 debug2: match found <<< 10896 1726882182.33934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.33981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882182.33988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882182.34050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882182.35565: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 10896 1726882182.35569: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882182.35623: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882182.35684: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpxcrcmx6c /root/.ansible/tmp/ansible-tmp-1726882182.305429-12145-58927421903278/AnsiballZ_command.py <<< 10896 1726882182.35689: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882182.305429-12145-58927421903278/AnsiballZ_command.py" <<< 10896 1726882182.35742: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpxcrcmx6c" to remote "/root/.ansible/tmp/ansible-tmp-1726882182.305429-12145-58927421903278/AnsiballZ_command.py" <<< 10896 1726882182.35746: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882182.305429-12145-58927421903278/AnsiballZ_command.py" <<< 10896 1726882182.36340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882182.36375: stderr chunk (state=3): >>><<< 10896 1726882182.36379: stdout chunk (state=3): >>><<< 10896 1726882182.36396: done transferring module to remote 10896 1726882182.36407: _low_level_execute_command(): starting 10896 1726882182.36410: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882182.305429-12145-58927421903278/ /root/.ansible/tmp/ansible-tmp-1726882182.305429-12145-58927421903278/AnsiballZ_command.py && sleep 0' 10896 1726882182.36838: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882182.36841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882182.36843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 10896 1726882182.36845: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882182.36851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.36900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882182.36904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882182.36969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882182.38677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882182.38702: stderr chunk (state=3): >>><<< 10896 1726882182.38705: stdout chunk (state=3): >>><<< 10896 1726882182.38717: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882182.38720: _low_level_execute_command(): starting 10896 1726882182.38724: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882182.305429-12145-58927421903278/AnsiballZ_command.py && sleep 0' 10896 1726882182.39132: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882182.39135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.39138: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882182.39140: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 10896 1726882182.39142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.39190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882182.39198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882182.39260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882182.54531: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/deprecated-bond"], "start": "2024-09-20 21:29:42.540821", "end": "2024-09-20 21:29:42.544009", "delta": "0:00:00.003188", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10896 1726882182.55910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882182.55934: stderr chunk (state=3): >>><<< 10896 1726882182.55938: stdout chunk (state=3): >>><<< 10896 1726882182.55954: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/deprecated-bond"], "start": "2024-09-20 21:29:42.540821", "end": "2024-09-20 21:29:42.544009", "delta": "0:00:00.003188", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
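The module result above belongs to the "** TEST check polling interval" task: the command module ran grep 'Polling Interval' /proc/net/bonding/deprecated-bond and the play then verifies that '110' appears in result.stdout (the conditional evaluated a few entries below). A minimal shell sketch of the same check, run by hand on the managed node, is shown here for reference; the interface name deprecated-bond and the 110 ms value are taken from this log, while the one-line pass/fail form is illustrative and not the playbook's own wording:

    # Print the MII polling interval from the kernel's bonding status file
    grep 'Polling Interval' /proc/net/bonding/deprecated-bond
    # Expected output, matching the captured result:
    #   MII Polling Interval (ms): 110
    # Rough equivalent of the play's "'110' in result.stdout" conditional
    grep 'Polling Interval' /proc/net/bonding/deprecated-bond | grep -q '110' && echo PASS || echo FAIL
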
10896 1726882182.55982: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882182.305429-12145-58927421903278/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882182.55988: _low_level_execute_command(): starting 10896 1726882182.55994: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882182.305429-12145-58927421903278/ > /dev/null 2>&1 && sleep 0' 10896 1726882182.56434: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882182.56437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882182.56444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.56446: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882182.56448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.56494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882182.56499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882182.56568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882182.58364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882182.58383: stderr chunk (state=3): >>><<< 10896 1726882182.58386: stdout chunk (state=3): >>><<< 10896 1726882182.58402: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882182.58408: handler run complete 10896 1726882182.58425: Evaluated conditional (False): False 10896 1726882182.58545: variable 'result' from source: unknown 10896 1726882182.58557: Evaluated conditional ('110' in result.stdout): True 10896 1726882182.58571: attempt loop complete, returning result 10896 1726882182.58574: _execute() done 10896 1726882182.58577: dumping result to json 10896 1726882182.58579: done dumping result, returning 10896 1726882182.58587: done running TaskExecutor() for managed_node2/TASK: ** TEST check polling interval [12673a56-9f93-8b02-b216-000000000071] 10896 1726882182.58592: sending task result for task 12673a56-9f93-8b02-b216-000000000071 10896 1726882182.58687: done sending task result for task 12673a56-9f93-8b02-b216-000000000071 10896 1726882182.58689: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/deprecated-bond" ], "delta": "0:00:00.003188", "end": "2024-09-20 21:29:42.544009", "rc": 0, "start": "2024-09-20 21:29:42.540821" } STDOUT: MII Polling Interval (ms): 110 10896 1726882182.58757: no more pending results, returning what we have 10896 1726882182.58759: results queue empty 10896 1726882182.58760: checking for any_errors_fatal 10896 1726882182.58768: done checking for any_errors_fatal 10896 1726882182.58768: checking for max_fail_percentage 10896 1726882182.58770: done checking for max_fail_percentage 10896 1726882182.58771: checking to see if all hosts have failed and the running result is not ok 10896 1726882182.58772: done checking to see if all hosts have failed 10896 1726882182.58772: getting the remaining hosts for this loop 10896 1726882182.58774: done getting the remaining hosts for this loop 10896 1726882182.58777: getting the next task for host managed_node2 10896 1726882182.58783: done getting next task for host managed_node2 10896 1726882182.58786: ^ task is: TASK: ** TEST check IPv4 10896 1726882182.58788: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882182.58792: getting variables 10896 1726882182.58800: in VariableManager get_vars() 10896 1726882182.58844: Calling all_inventory to load vars for managed_node2 10896 1726882182.58847: Calling groups_inventory to load vars for managed_node2 10896 1726882182.58849: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882182.58858: Calling all_plugins_play to load vars for managed_node2 10896 1726882182.58861: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882182.58863: Calling groups_plugins_play to load vars for managed_node2 10896 1726882182.59943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882182.61290: done with get_vars() 10896 1726882182.61312: done getting variables 10896 1726882182.61353: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:80 Friday 20 September 2024 21:29:42 -0400 (0:00:00.347) 0:00:24.180 ****** 10896 1726882182.61374: entering _queue_task() for managed_node2/command 10896 1726882182.61608: worker is 1 (out of 1 available) 10896 1726882182.61620: exiting _queue_task() for managed_node2/command 10896 1726882182.61632: done queuing things up, now waiting for results queue to drain 10896 1726882182.61633: waiting for pending results... 10896 1726882182.61802: running TaskExecutor() for managed_node2/TASK: ** TEST check IPv4 10896 1726882182.61866: in run() - task 12673a56-9f93-8b02-b216-000000000072 10896 1726882182.61873: variable 'ansible_search_path' from source: unknown 10896 1726882182.61905: calling self._execute() 10896 1726882182.61986: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.61990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.62002: variable 'omit' from source: magic vars 10896 1726882182.62269: variable 'ansible_distribution_major_version' from source: facts 10896 1726882182.62279: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882182.62284: variable 'omit' from source: magic vars 10896 1726882182.62304: variable 'omit' from source: magic vars 10896 1726882182.62366: variable 'controller_device' from source: play vars 10896 1726882182.62381: variable 'omit' from source: magic vars 10896 1726882182.62417: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882182.62448: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882182.62464: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882182.62477: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882182.62487: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882182.62514: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 10896 1726882182.62517: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.62520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.62586: Set connection var ansible_connection to ssh 10896 1726882182.62590: Set connection var ansible_timeout to 10 10896 1726882182.62596: Set connection var ansible_shell_type to sh 10896 1726882182.62603: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882182.62608: Set connection var ansible_shell_executable to /bin/sh 10896 1726882182.62613: Set connection var ansible_pipelining to False 10896 1726882182.62632: variable 'ansible_shell_executable' from source: unknown 10896 1726882182.62634: variable 'ansible_connection' from source: unknown 10896 1726882182.62638: variable 'ansible_module_compression' from source: unknown 10896 1726882182.62640: variable 'ansible_shell_type' from source: unknown 10896 1726882182.62643: variable 'ansible_shell_executable' from source: unknown 10896 1726882182.62645: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.62647: variable 'ansible_pipelining' from source: unknown 10896 1726882182.62650: variable 'ansible_timeout' from source: unknown 10896 1726882182.62652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.62750: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882182.62760: variable 'omit' from source: magic vars 10896 1726882182.62765: starting attempt loop 10896 1726882182.62768: running the handler 10896 1726882182.62802: _low_level_execute_command(): starting 10896 1726882182.62936: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882182.63515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.63539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882182.63556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882182.63579: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882182.63678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882182.65248: stdout chunk (state=3): >>>/root <<< 10896 1726882182.65344: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 10896 1726882182.65369: stderr chunk (state=3): >>><<< 10896 1726882182.65372: stdout chunk (state=3): >>><<< 10896 1726882182.65390: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882182.65406: _low_level_execute_command(): starting 10896 1726882182.65410: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882182.6538985-12167-106631896259392 `" && echo ansible-tmp-1726882182.6538985-12167-106631896259392="` echo /root/.ansible/tmp/ansible-tmp-1726882182.6538985-12167-106631896259392 `" ) && sleep 0' 10896 1726882182.65810: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882182.65814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.65817: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882182.65826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.65866: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882182.65870: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882182.65937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882182.67788: stdout chunk (state=3): >>>ansible-tmp-1726882182.6538985-12167-106631896259392=/root/.ansible/tmp/ansible-tmp-1726882182.6538985-12167-106631896259392 <<< 10896 1726882182.67899: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882182.67921: stderr chunk (state=3): >>><<< 10896 1726882182.67924: stdout chunk (state=3): >>><<< 10896 1726882182.67940: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882182.6538985-12167-106631896259392=/root/.ansible/tmp/ansible-tmp-1726882182.6538985-12167-106631896259392 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882182.67967: variable 'ansible_module_compression' from source: unknown 10896 1726882182.68019: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10896 1726882182.68050: variable 'ansible_facts' from source: unknown 10896 1726882182.68109: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882182.6538985-12167-106631896259392/AnsiballZ_command.py 10896 1726882182.68210: Sending initial data 10896 1726882182.68213: Sent initial data (156 bytes) 10896 1726882182.68656: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882182.68659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882182.68662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.68664: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882182.68666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.68720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882182.68723: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882182.68727: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882182.68786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882182.70311: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882182.70371: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882182.70430: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpsy6353od /root/.ansible/tmp/ansible-tmp-1726882182.6538985-12167-106631896259392/AnsiballZ_command.py <<< 10896 1726882182.70434: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882182.6538985-12167-106631896259392/AnsiballZ_command.py" <<< 10896 1726882182.70489: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpsy6353od" to remote "/root/.ansible/tmp/ansible-tmp-1726882182.6538985-12167-106631896259392/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882182.6538985-12167-106631896259392/AnsiballZ_command.py" <<< 10896 1726882182.71095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882182.71130: stderr chunk (state=3): >>><<< 10896 1726882182.71134: stdout chunk (state=3): >>><<< 10896 1726882182.71179: done transferring module to remote 10896 1726882182.71187: _low_level_execute_command(): starting 10896 1726882182.71192: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882182.6538985-12167-106631896259392/ /root/.ansible/tmp/ansible-tmp-1726882182.6538985-12167-106631896259392/AnsiballZ_command.py && sleep 0' 10896 1726882182.71621: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882182.71624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.71626: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882182.71628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.71675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882182.71678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882182.71749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882182.73458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882182.73480: stderr chunk (state=3): >>><<< 10896 1726882182.73483: stdout chunk (state=3): >>><<< 10896 1726882182.73498: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882182.73501: _low_level_execute_command(): starting 10896 1726882182.73504: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882182.6538985-12167-106631896259392/AnsiballZ_command.py && sleep 0' 10896 1726882182.73900: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882182.73921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882182.73925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.73971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882182.73974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882182.74045: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882182.89297: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.127/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond\n valid_lft 237sec preferred_lft 237sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "deprecated-bond"], "start": "2024-09-20 21:29:42.888225", "end": "2024-09-20 21:29:42.891712", "delta": "0:00:00.003487", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10896 1726882182.90649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882182.90674: stderr chunk (state=3): >>><<< 10896 1726882182.90677: stdout chunk (state=3): >>><<< 10896 1726882182.90698: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.127/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond\n valid_lft 237sec preferred_lft 237sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "deprecated-bond"], "start": "2024-09-20 21:29:42.888225", "end": "2024-09-20 21:29:42.891712", "delta": "0:00:00.003487", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
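The result above belongs to the "** TEST check IPv4" task (tests_bond_deprecated.yml:80): the command module ran ip -4 a s deprecated-bond and the play then verifies that '192.0.2' appears in result.stdout (the conditional evaluated a few entries below). A minimal shell sketch of the same verification on the managed node follows; the interface name and the 192.0.2.0/24 prefix are taken from this log, and the grep-based assertion is illustrative only:

    # Show IPv4 addresses assigned to the bond interface (long form of "ip -4 a s")
    ip -4 addr show deprecated-bond
    # Expected to include a dynamic address from 192.0.2.0/24, e.g.:
    #   inet 192.0.2.127/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond
    # Rough equivalent of the play's "'192.0.2' in result.stdout" conditional
    ip -4 addr show deprecated-bond | grep -q 'inet 192\.0\.2\.' && echo PASS || echo FAIL
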
10896 1726882182.90726: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882182.6538985-12167-106631896259392/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882182.90736: _low_level_execute_command(): starting 10896 1726882182.90738: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882182.6538985-12167-106631896259392/ > /dev/null 2>&1 && sleep 0' 10896 1726882182.91177: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882182.91214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882182.91223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.91226: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882182.91228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882182.91231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.91274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882182.91280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882182.91283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882182.91346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882182.93129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882182.93156: stderr chunk (state=3): >>><<< 10896 1726882182.93159: stdout chunk (state=3): >>><<< 10896 1726882182.93174: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882182.93179: handler run complete 10896 1726882182.93200: Evaluated conditional (False): False 10896 1726882182.93311: variable 'result' from source: set_fact 10896 1726882182.93324: Evaluated conditional ('192.0.2' in result.stdout): True 10896 1726882182.93334: attempt loop complete, returning result 10896 1726882182.93337: _execute() done 10896 1726882182.93339: dumping result to json 10896 1726882182.93344: done dumping result, returning 10896 1726882182.93351: done running TaskExecutor() for managed_node2/TASK: ** TEST check IPv4 [12673a56-9f93-8b02-b216-000000000072] 10896 1726882182.93361: sending task result for task 12673a56-9f93-8b02-b216-000000000072 10896 1726882182.93448: done sending task result for task 12673a56-9f93-8b02-b216-000000000072 10896 1726882182.93450: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "deprecated-bond" ], "delta": "0:00:00.003487", "end": "2024-09-20 21:29:42.891712", "rc": 0, "start": "2024-09-20 21:29:42.888225" } STDOUT: 13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.127/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond valid_lft 237sec preferred_lft 237sec 10896 1726882182.93555: no more pending results, returning what we have 10896 1726882182.93559: results queue empty 10896 1726882182.93559: checking for any_errors_fatal 10896 1726882182.93565: done checking for any_errors_fatal 10896 1726882182.93566: checking for max_fail_percentage 10896 1726882182.93568: done checking for max_fail_percentage 10896 1726882182.93568: checking to see if all hosts have failed and the running result is not ok 10896 1726882182.93569: done checking to see if all hosts have failed 10896 1726882182.93570: getting the remaining hosts for this loop 10896 1726882182.93578: done getting the remaining hosts for this loop 10896 1726882182.93582: getting the next task for host managed_node2 10896 1726882182.93587: done getting next task for host managed_node2 10896 1726882182.93590: ^ task is: TASK: ** TEST check IPv6 10896 1726882182.93592: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882182.93597: getting variables 10896 1726882182.93598: in VariableManager get_vars() 10896 1726882182.93634: Calling all_inventory to load vars for managed_node2 10896 1726882182.93637: Calling groups_inventory to load vars for managed_node2 10896 1726882182.93639: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882182.93648: Calling all_plugins_play to load vars for managed_node2 10896 1726882182.93650: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882182.93653: Calling groups_plugins_play to load vars for managed_node2 10896 1726882182.94547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882182.95381: done with get_vars() 10896 1726882182.95398: done getting variables 10896 1726882182.95439: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:87 Friday 20 September 2024 21:29:42 -0400 (0:00:00.340) 0:00:24.521 ****** 10896 1726882182.95460: entering _queue_task() for managed_node2/command 10896 1726882182.95678: worker is 1 (out of 1 available) 10896 1726882182.95690: exiting _queue_task() for managed_node2/command 10896 1726882182.95704: done queuing things up, now waiting for results queue to drain 10896 1726882182.95705: waiting for pending results... 10896 1726882182.95870: running TaskExecutor() for managed_node2/TASK: ** TEST check IPv6 10896 1726882182.95937: in run() - task 12673a56-9f93-8b02-b216-000000000073 10896 1726882182.95946: variable 'ansible_search_path' from source: unknown 10896 1726882182.95975: calling self._execute() 10896 1726882182.96057: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.96061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.96070: variable 'omit' from source: magic vars 10896 1726882182.96351: variable 'ansible_distribution_major_version' from source: facts 10896 1726882182.96361: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882182.96364: variable 'omit' from source: magic vars 10896 1726882182.96382: variable 'omit' from source: magic vars 10896 1726882182.96451: variable 'controller_device' from source: play vars 10896 1726882182.96464: variable 'omit' from source: magic vars 10896 1726882182.96502: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882182.96528: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882182.96545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882182.96557: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882182.96567: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882182.96590: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 10896 1726882182.96600: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.96603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.96668: Set connection var ansible_connection to ssh 10896 1726882182.96672: Set connection var ansible_timeout to 10 10896 1726882182.96675: Set connection var ansible_shell_type to sh 10896 1726882182.96682: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882182.96687: Set connection var ansible_shell_executable to /bin/sh 10896 1726882182.96691: Set connection var ansible_pipelining to False 10896 1726882182.96716: variable 'ansible_shell_executable' from source: unknown 10896 1726882182.96719: variable 'ansible_connection' from source: unknown 10896 1726882182.96723: variable 'ansible_module_compression' from source: unknown 10896 1726882182.96725: variable 'ansible_shell_type' from source: unknown 10896 1726882182.96727: variable 'ansible_shell_executable' from source: unknown 10896 1726882182.96730: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882182.96732: variable 'ansible_pipelining' from source: unknown 10896 1726882182.96734: variable 'ansible_timeout' from source: unknown 10896 1726882182.96738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882182.96840: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882182.96849: variable 'omit' from source: magic vars 10896 1726882182.96855: starting attempt loop 10896 1726882182.96859: running the handler 10896 1726882182.96870: _low_level_execute_command(): starting 10896 1726882182.96877: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882182.97389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882182.97396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 10896 1726882182.97399: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.97448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882182.97451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882182.97457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882182.97521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882182.99104: 
stdout chunk (state=3): >>>/root <<< 10896 1726882182.99199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882182.99229: stderr chunk (state=3): >>><<< 10896 1726882182.99233: stdout chunk (state=3): >>><<< 10896 1726882182.99253: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882182.99265: _low_level_execute_command(): starting 10896 1726882182.99271: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882182.9925244-12176-231983590160821 `" && echo ansible-tmp-1726882182.9925244-12176-231983590160821="` echo /root/.ansible/tmp/ansible-tmp-1726882182.9925244-12176-231983590160821 `" ) && sleep 0' 10896 1726882182.99712: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882182.99715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882182.99718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10896 1726882182.99728: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882182.99764: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882182.99776: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882182.99840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882183.01696: stdout chunk (state=3): 
>>>ansible-tmp-1726882182.9925244-12176-231983590160821=/root/.ansible/tmp/ansible-tmp-1726882182.9925244-12176-231983590160821 <<< 10896 1726882183.01809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882183.01835: stderr chunk (state=3): >>><<< 10896 1726882183.01838: stdout chunk (state=3): >>><<< 10896 1726882183.01854: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882182.9925244-12176-231983590160821=/root/.ansible/tmp/ansible-tmp-1726882182.9925244-12176-231983590160821 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882183.01882: variable 'ansible_module_compression' from source: unknown 10896 1726882183.01929: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10896 1726882183.01960: variable 'ansible_facts' from source: unknown 10896 1726882183.02021: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882182.9925244-12176-231983590160821/AnsiballZ_command.py 10896 1726882183.02120: Sending initial data 10896 1726882183.02124: Sent initial data (156 bytes) 10896 1726882183.02575: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882183.02578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882183.02581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882183.02583: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882183.02586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882183.02635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/6f323b04b0' <<< 10896 1726882183.02638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882183.02712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882183.04226: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882183.04283: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882183.04343: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmp7wk586cr /root/.ansible/tmp/ansible-tmp-1726882182.9925244-12176-231983590160821/AnsiballZ_command.py <<< 10896 1726882183.04346: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882182.9925244-12176-231983590160821/AnsiballZ_command.py" <<< 10896 1726882183.04409: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmp7wk586cr" to remote "/root/.ansible/tmp/ansible-tmp-1726882182.9925244-12176-231983590160821/AnsiballZ_command.py" <<< 10896 1726882183.04411: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882182.9925244-12176-231983590160821/AnsiballZ_command.py" <<< 10896 1726882183.05021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882183.05064: stderr chunk (state=3): >>><<< 10896 1726882183.05067: stdout chunk (state=3): >>><<< 10896 1726882183.05107: done transferring module to remote 10896 1726882183.05117: _low_level_execute_command(): starting 10896 1726882183.05122: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882182.9925244-12176-231983590160821/ /root/.ansible/tmp/ansible-tmp-1726882182.9925244-12176-231983590160821/AnsiballZ_command.py && sleep 0' 10896 1726882183.05576: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882183.05579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882183.05582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882183.05584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 
1726882183.05586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882183.05649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882183.05651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882183.05707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882183.07407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882183.07432: stderr chunk (state=3): >>><<< 10896 1726882183.07436: stdout chunk (state=3): >>><<< 10896 1726882183.07453: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882183.07457: _low_level_execute_command(): starting 10896 1726882183.07463: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882182.9925244-12176-231983590160821/AnsiballZ_command.py && sleep 0' 10896 1726882183.07892: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882183.07907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882183.07925: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882183.07928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882183.07977: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882183.07980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882183.08054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882183.23209: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::145/128 scope global dynamic noprefixroute \n valid_lft 237sec preferred_lft 237sec\n inet6 2001:db8::58e7:9cff:fe78:4279/64 scope global dynamic noprefixroute \n valid_lft 1798sec preferred_lft 1798sec\n inet6 fe80::58e7:9cff:fe78:4279/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "deprecated-bond"], "start": "2024-09-20 21:29:43.227342", "end": "2024-09-20 21:29:43.230750", "delta": "0:00:00.003408", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10896 1726882183.24597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882183.24624: stderr chunk (state=3): >>><<< 10896 1726882183.24628: stdout chunk (state=3): >>><<< 10896 1726882183.24647: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::145/128 scope global dynamic noprefixroute \n valid_lft 237sec preferred_lft 237sec\n inet6 2001:db8::58e7:9cff:fe78:4279/64 scope global dynamic noprefixroute \n valid_lft 1798sec preferred_lft 1798sec\n inet6 fe80::58e7:9cff:fe78:4279/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "deprecated-bond"], "start": "2024-09-20 21:29:43.227342", "end": "2024-09-20 21:29:43.230750", "delta": "0:00:00.003408", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 10896 1726882183.24677: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882182.9925244-12176-231983590160821/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882183.24684: _low_level_execute_command(): starting 10896 1726882183.24689: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882182.9925244-12176-231983590160821/ > /dev/null 2>&1 && sleep 0' 10896 1726882183.25144: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882183.25147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882183.25149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882183.25151: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882183.25153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882183.25215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882183.25219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882183.25274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882183.27085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882183.27114: stderr chunk (state=3): >>><<< 10896 1726882183.27117: stdout chunk (state=3): >>><<< 10896 1726882183.27129: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882183.27135: handler run complete 10896 1726882183.27155: Evaluated conditional (False): False 10896 1726882183.27266: variable 'result' from source: set_fact 10896 1726882183.27279: Evaluated conditional ('2001' in result.stdout): True 10896 1726882183.27289: attempt loop complete, returning result 10896 1726882183.27296: _execute() done 10896 1726882183.27299: dumping result to json 10896 1726882183.27305: done dumping result, returning 10896 1726882183.27312: done running TaskExecutor() for managed_node2/TASK: ** TEST check IPv6 [12673a56-9f93-8b02-b216-000000000073] 10896 1726882183.27317: sending task result for task 12673a56-9f93-8b02-b216-000000000073 10896 1726882183.27414: done sending task result for task 12673a56-9f93-8b02-b216-000000000073 10896 1726882183.27416: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "deprecated-bond" ], "delta": "0:00:00.003408", "end": "2024-09-20 21:29:43.230750", "rc": 0, "start": "2024-09-20 21:29:43.227342" } STDOUT: 13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::145/128 scope global dynamic noprefixroute valid_lft 237sec preferred_lft 237sec inet6 2001:db8::58e7:9cff:fe78:4279/64 scope global dynamic noprefixroute valid_lft 1798sec preferred_lft 1798sec inet6 fe80::58e7:9cff:fe78:4279/64 scope link noprefixroute valid_lft forever preferred_lft forever 10896 1726882183.27485: no more pending results, returning what we have 10896 1726882183.27488: results queue empty 10896 1726882183.27489: checking for any_errors_fatal 10896 1726882183.27500: done checking for any_errors_fatal 10896 1726882183.27500: checking for max_fail_percentage 10896 1726882183.27502: done checking for max_fail_percentage 10896 1726882183.27503: checking to see if all hosts have failed and the running result is not ok 10896 1726882183.27504: done checking to see if all hosts have failed 10896 1726882183.27504: getting the remaining hosts for this loop 10896 1726882183.27506: done getting the remaining hosts for this loop 10896 1726882183.27510: getting the next task for host managed_node2 10896 1726882183.27520: done getting next task for host managed_node2 10896 1726882183.27525: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10896 1726882183.27529: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10896 1726882183.27546: getting variables 10896 1726882183.27548: in VariableManager get_vars() 10896 1726882183.27584: Calling all_inventory to load vars for managed_node2 10896 1726882183.27586: Calling groups_inventory to load vars for managed_node2 10896 1726882183.27588: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882183.27606: Calling all_plugins_play to load vars for managed_node2 10896 1726882183.27609: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882183.27612: Calling groups_plugins_play to load vars for managed_node2 10896 1726882183.28380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882183.29243: done with get_vars() 10896 1726882183.29259: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:29:43 -0400 (0:00:00.338) 0:00:24.860 ****** 10896 1726882183.29333: entering _queue_task() for managed_node2/include_tasks 10896 1726882183.29552: worker is 1 (out of 1 available) 10896 1726882183.29564: exiting _queue_task() for managed_node2/include_tasks 10896 1726882183.29576: done queuing things up, now waiting for results queue to drain 10896 1726882183.29578: waiting for pending results... 
10896 1726882183.29744: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10896 1726882183.29846: in run() - task 12673a56-9f93-8b02-b216-00000000007d 10896 1726882183.29858: variable 'ansible_search_path' from source: unknown 10896 1726882183.29862: variable 'ansible_search_path' from source: unknown 10896 1726882183.29889: calling self._execute() 10896 1726882183.29966: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882183.29970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882183.29978: variable 'omit' from source: magic vars 10896 1726882183.30248: variable 'ansible_distribution_major_version' from source: facts 10896 1726882183.30255: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882183.30261: _execute() done 10896 1726882183.30264: dumping result to json 10896 1726882183.30268: done dumping result, returning 10896 1726882183.30274: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-8b02-b216-00000000007d] 10896 1726882183.30279: sending task result for task 12673a56-9f93-8b02-b216-00000000007d 10896 1726882183.30364: done sending task result for task 12673a56-9f93-8b02-b216-00000000007d 10896 1726882183.30366: WORKER PROCESS EXITING 10896 1726882183.30411: no more pending results, returning what we have 10896 1726882183.30416: in VariableManager get_vars() 10896 1726882183.30457: Calling all_inventory to load vars for managed_node2 10896 1726882183.30460: Calling groups_inventory to load vars for managed_node2 10896 1726882183.30462: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882183.30470: Calling all_plugins_play to load vars for managed_node2 10896 1726882183.30473: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882183.30476: Calling groups_plugins_play to load vars for managed_node2 10896 1726882183.31428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882183.32347: done with get_vars() 10896 1726882183.32368: variable 'ansible_search_path' from source: unknown 10896 1726882183.32369: variable 'ansible_search_path' from source: unknown 10896 1726882183.32412: we have included files to process 10896 1726882183.32414: generating all_blocks data 10896 1726882183.32416: done generating all_blocks data 10896 1726882183.32421: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10896 1726882183.32422: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10896 1726882183.32424: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10896 1726882183.33045: done processing included file 10896 1726882183.33047: iterating over new_blocks loaded from include file 10896 1726882183.33048: in VariableManager get_vars() 10896 1726882183.33080: done with get_vars() 10896 1726882183.33082: filtering new block on tags 10896 1726882183.33114: done filtering new block on tags 10896 1726882183.33117: in VariableManager get_vars() 10896 1726882183.33138: done with get_vars() 10896 1726882183.33140: filtering new block on tags 10896 1726882183.33187: done filtering new block on tags 10896 1726882183.33189: in 
VariableManager get_vars() 10896 1726882183.33215: done with get_vars() 10896 1726882183.33217: filtering new block on tags 10896 1726882183.33251: done filtering new block on tags 10896 1726882183.33253: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 10896 1726882183.33258: extending task lists for all hosts with included blocks 10896 1726882183.34386: done extending task lists 10896 1726882183.34387: done processing included files 10896 1726882183.34388: results queue empty 10896 1726882183.34389: checking for any_errors_fatal 10896 1726882183.34396: done checking for any_errors_fatal 10896 1726882183.34397: checking for max_fail_percentage 10896 1726882183.34399: done checking for max_fail_percentage 10896 1726882183.34399: checking to see if all hosts have failed and the running result is not ok 10896 1726882183.34400: done checking to see if all hosts have failed 10896 1726882183.34401: getting the remaining hosts for this loop 10896 1726882183.34402: done getting the remaining hosts for this loop 10896 1726882183.34405: getting the next task for host managed_node2 10896 1726882183.34409: done getting next task for host managed_node2 10896 1726882183.34412: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10896 1726882183.34415: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10896 1726882183.34425: getting variables 10896 1726882183.34426: in VariableManager get_vars() 10896 1726882183.34441: Calling all_inventory to load vars for managed_node2 10896 1726882183.34444: Calling groups_inventory to load vars for managed_node2 10896 1726882183.34446: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882183.34452: Calling all_plugins_play to load vars for managed_node2 10896 1726882183.34454: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882183.34456: Calling groups_plugins_play to load vars for managed_node2 10896 1726882183.35245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882183.36154: done with get_vars() 10896 1726882183.36168: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:29:43 -0400 (0:00:00.068) 0:00:24.929 ****** 10896 1726882183.36226: entering _queue_task() for managed_node2/setup 10896 1726882183.36483: worker is 1 (out of 1 available) 10896 1726882183.36498: exiting _queue_task() for managed_node2/setup 10896 1726882183.36510: done queuing things up, now waiting for results queue to drain 10896 1726882183.36511: waiting for pending results... 10896 1726882183.36812: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10896 1726882183.36911: in run() - task 12673a56-9f93-8b02-b216-000000000494 10896 1726882183.36933: variable 'ansible_search_path' from source: unknown 10896 1726882183.37001: variable 'ansible_search_path' from source: unknown 10896 1726882183.37006: calling self._execute() 10896 1726882183.37075: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882183.37089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882183.37109: variable 'omit' from source: magic vars 10896 1726882183.37482: variable 'ansible_distribution_major_version' from source: facts 10896 1726882183.37503: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882183.37714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882183.39760: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882183.39842: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882183.40000: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882183.40003: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882183.40005: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882183.40032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882183.40064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 10896 1726882183.40097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882183.40141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882183.40158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882183.40218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882183.40246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882183.40274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882183.40320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882183.40337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882183.40500: variable '__network_required_facts' from source: role '' defaults 10896 1726882183.40515: variable 'ansible_facts' from source: unknown 10896 1726882183.41255: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 10896 1726882183.41265: when evaluation is False, skipping this task 10896 1726882183.41272: _execute() done 10896 1726882183.41280: dumping result to json 10896 1726882183.41289: done dumping result, returning 10896 1726882183.41308: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-8b02-b216-000000000494] 10896 1726882183.41319: sending task result for task 12673a56-9f93-8b02-b216-000000000494 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10896 1726882183.41483: no more pending results, returning what we have 10896 1726882183.41486: results queue empty 10896 1726882183.41487: checking for any_errors_fatal 10896 1726882183.41489: done checking for any_errors_fatal 10896 1726882183.41489: checking for max_fail_percentage 10896 1726882183.41491: done checking for max_fail_percentage 10896 1726882183.41492: checking to see if all hosts have failed and the running result is not ok 10896 1726882183.41494: done checking to see if all hosts have failed 10896 1726882183.41495: getting the remaining hosts for this loop 10896 1726882183.41497: done getting the remaining hosts for this loop 10896 1726882183.41501: getting the next task for host managed_node2 10896 1726882183.41510: done getting next task for host 
managed_node2 10896 1726882183.41513: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 10896 1726882183.41518: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10896 1726882183.41538: getting variables 10896 1726882183.41539: in VariableManager get_vars() 10896 1726882183.41728: Calling all_inventory to load vars for managed_node2 10896 1726882183.41731: Calling groups_inventory to load vars for managed_node2 10896 1726882183.41733: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882183.41742: Calling all_plugins_play to load vars for managed_node2 10896 1726882183.41745: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882183.41748: Calling groups_plugins_play to load vars for managed_node2 10896 1726882183.42283: done sending task result for task 12673a56-9f93-8b02-b216-000000000494 10896 1726882183.42286: WORKER PROCESS EXITING 10896 1726882183.43087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882183.44686: done with get_vars() 10896 1726882183.44714: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:29:43 -0400 (0:00:00.086) 0:00:25.015 ****** 10896 1726882183.44846: entering _queue_task() for managed_node2/stat 10896 1726882183.45211: worker is 1 (out of 1 available) 10896 1726882183.45226: exiting _queue_task() for managed_node2/stat 10896 1726882183.45239: done queuing things up, now waiting for results queue to drain 10896 1726882183.45240: waiting for pending results... 
10896 1726882183.45571: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 10896 1726882183.45771: in run() - task 12673a56-9f93-8b02-b216-000000000496 10896 1726882183.45775: variable 'ansible_search_path' from source: unknown 10896 1726882183.45778: variable 'ansible_search_path' from source: unknown 10896 1726882183.45786: calling self._execute() 10896 1726882183.46145: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882183.46151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882183.46162: variable 'omit' from source: magic vars 10896 1726882183.46577: variable 'ansible_distribution_major_version' from source: facts 10896 1726882183.46585: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882183.46752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882183.47024: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882183.47071: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882183.47108: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882183.47145: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882183.47238: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882183.47270: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 1726882183.47387: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882183.47390: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882183.47499: variable '__network_is_ostree' from source: set_fact 10896 1726882183.47502: Evaluated conditional (not __network_is_ostree is defined): False 10896 1726882183.47505: when evaluation is False, skipping this task 10896 1726882183.47507: _execute() done 10896 1726882183.47509: dumping result to json 10896 1726882183.47511: done dumping result, returning 10896 1726882183.47519: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-8b02-b216-000000000496] 10896 1726882183.47525: sending task result for task 12673a56-9f93-8b02-b216-000000000496 10896 1726882183.47656: done sending task result for task 12673a56-9f93-8b02-b216-000000000496 10896 1726882183.47659: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10896 1726882183.47758: no more pending results, returning what we have 10896 1726882183.47762: results queue empty 10896 1726882183.47763: checking for any_errors_fatal 10896 1726882183.47773: done checking for any_errors_fatal 10896 1726882183.47775: checking for 
max_fail_percentage 10896 1726882183.47777: done checking for max_fail_percentage 10896 1726882183.47778: checking to see if all hosts have failed and the running result is not ok 10896 1726882183.47779: done checking to see if all hosts have failed 10896 1726882183.47780: getting the remaining hosts for this loop 10896 1726882183.47782: done getting the remaining hosts for this loop 10896 1726882183.47786: getting the next task for host managed_node2 10896 1726882183.47796: done getting next task for host managed_node2 10896 1726882183.47800: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10896 1726882183.47806: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10896 1726882183.47826: getting variables 10896 1726882183.47828: in VariableManager get_vars() 10896 1726882183.47873: Calling all_inventory to load vars for managed_node2 10896 1726882183.47877: Calling groups_inventory to load vars for managed_node2 10896 1726882183.47879: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882183.48109: Calling all_plugins_play to load vars for managed_node2 10896 1726882183.48114: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882183.48118: Calling groups_plugins_play to load vars for managed_node2 10896 1726882183.49788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882183.51449: done with get_vars() 10896 1726882183.51481: done getting variables 10896 1726882183.51548: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:29:43 -0400 (0:00:00.067) 0:00:25.082 ****** 10896 1726882183.51591: entering _queue_task() for managed_node2/set_fact 10896 1726882183.51940: worker is 1 (out of 1 available) 10896 1726882183.51960: exiting _queue_task() for managed_node2/set_fact 10896 1726882183.51971: done queuing things up, now waiting for results queue to drain 10896 1726882183.51973: waiting for pending results... 10896 1726882183.52407: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10896 1726882183.52463: in run() - task 12673a56-9f93-8b02-b216-000000000497 10896 1726882183.52483: variable 'ansible_search_path' from source: unknown 10896 1726882183.52501: variable 'ansible_search_path' from source: unknown 10896 1726882183.52546: calling self._execute() 10896 1726882183.52654: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882183.52667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882183.52722: variable 'omit' from source: magic vars 10896 1726882183.53073: variable 'ansible_distribution_major_version' from source: facts 10896 1726882183.53088: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882183.53274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882183.53557: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882183.53704: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882183.53708: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882183.53710: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882183.53781: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882183.53826: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 1726882183.53858: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882183.53890: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882183.53985: variable '__network_is_ostree' from source: set_fact 10896 1726882183.53999: Evaluated conditional (not __network_is_ostree is defined): False 10896 1726882183.54008: when evaluation is False, skipping this task 10896 1726882183.54015: _execute() done 10896 1726882183.54032: dumping result to json 10896 1726882183.54041: done dumping result, returning 10896 1726882183.54054: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-8b02-b216-000000000497] 10896 1726882183.54063: sending task result for task 12673a56-9f93-8b02-b216-000000000497 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10896 1726882183.54288: no more pending results, returning what we have 10896 1726882183.54294: results queue empty 10896 1726882183.54296: checking for any_errors_fatal 10896 1726882183.54301: done checking for any_errors_fatal 10896 1726882183.54302: checking for max_fail_percentage 10896 1726882183.54305: done checking for max_fail_percentage 10896 1726882183.54306: checking to see if all hosts have failed and the running result is not ok 10896 1726882183.54307: done checking to see if all hosts have failed 10896 1726882183.54308: getting the remaining hosts for this loop 10896 1726882183.54310: done getting the remaining hosts for this loop 10896 1726882183.54313: getting the next task for host managed_node2 10896 1726882183.54323: done getting next task for host managed_node2 10896 1726882183.54327: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 10896 1726882183.54333: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10896 1726882183.54355: getting variables 10896 1726882183.54357: in VariableManager get_vars() 10896 1726882183.54402: Calling all_inventory to load vars for managed_node2 10896 1726882183.54405: Calling groups_inventory to load vars for managed_node2 10896 1726882183.54407: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882183.54418: Calling all_plugins_play to load vars for managed_node2 10896 1726882183.54422: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882183.54426: Calling groups_plugins_play to load vars for managed_node2 10896 1726882183.55107: done sending task result for task 12673a56-9f93-8b02-b216-000000000497 10896 1726882183.55111: WORKER PROCESS EXITING 10896 1726882183.56070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882183.57757: done with get_vars() 10896 1726882183.57782: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:29:43 -0400 (0:00:00.062) 0:00:25.145 ****** 10896 1726882183.57888: entering _queue_task() for managed_node2/service_facts 10896 1726882183.58238: worker is 1 (out of 1 available) 10896 1726882183.58251: exiting _queue_task() for managed_node2/service_facts 10896 1726882183.58264: done queuing things up, now waiting for results queue to drain 10896 1726882183.58266: waiting for pending results... 10896 1726882183.58467: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 10896 1726882183.58631: in run() - task 12673a56-9f93-8b02-b216-000000000499 10896 1726882183.58637: variable 'ansible_search_path' from source: unknown 10896 1726882183.58641: variable 'ansible_search_path' from source: unknown 10896 1726882183.58674: calling self._execute() 10896 1726882183.58766: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882183.58770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882183.58781: variable 'omit' from source: magic vars 10896 1726882183.59131: variable 'ansible_distribution_major_version' from source: facts 10896 1726882183.59142: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882183.59148: variable 'omit' from source: magic vars 10896 1726882183.59230: variable 'omit' from source: magic vars 10896 1726882183.59264: variable 'omit' from source: magic vars 10896 1726882183.59307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882183.59342: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882183.59396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882183.59400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882183.59403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882183.59419: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882183.59422: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882183.59427: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882183.59613: Set connection var ansible_connection to ssh 10896 1726882183.59616: Set connection var ansible_timeout to 10 10896 1726882183.59620: Set connection var ansible_shell_type to sh 10896 1726882183.59623: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882183.59625: Set connection var ansible_shell_executable to /bin/sh 10896 1726882183.59627: Set connection var ansible_pipelining to False 10896 1726882183.59629: variable 'ansible_shell_executable' from source: unknown 10896 1726882183.59631: variable 'ansible_connection' from source: unknown 10896 1726882183.59634: variable 'ansible_module_compression' from source: unknown 10896 1726882183.59636: variable 'ansible_shell_type' from source: unknown 10896 1726882183.59638: variable 'ansible_shell_executable' from source: unknown 10896 1726882183.59640: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882183.59642: variable 'ansible_pipelining' from source: unknown 10896 1726882183.59644: variable 'ansible_timeout' from source: unknown 10896 1726882183.59646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882183.59792: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10896 1726882183.59806: variable 'omit' from source: magic vars 10896 1726882183.59811: starting attempt loop 10896 1726882183.59814: running the handler 10896 1726882183.59828: _low_level_execute_command(): starting 10896 1726882183.59836: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882183.60530: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882183.60543: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882183.60554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882183.60568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882183.60581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882183.60597: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882183.60603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882183.60619: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882183.60703: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 10896 1726882183.60706: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10896 1726882183.60708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882183.60710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882183.60713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882183.60715: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882183.60717: stderr chunk (state=3): >>>debug2: match found <<< 10896 1726882183.60719: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882183.60760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882183.60772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882183.60791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882183.60884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882183.62506: stdout chunk (state=3): >>>/root <<< 10896 1726882183.62755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882183.62758: stdout chunk (state=3): >>><<< 10896 1726882183.62760: stderr chunk (state=3): >>><<< 10896 1726882183.62763: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882183.62765: _low_level_execute_command(): starting 10896 1726882183.62768: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882183.6267297-12199-65641297107767 `" && echo ansible-tmp-1726882183.6267297-12199-65641297107767="` echo /root/.ansible/tmp/ansible-tmp-1726882183.6267297-12199-65641297107767 `" ) && sleep 0' 10896 1726882183.63340: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882183.63343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882183.63404: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882183.63408: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882183.63410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882183.63425: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882183.63492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882183.63523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882183.63564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882183.63632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882183.65501: stdout chunk (state=3): >>>ansible-tmp-1726882183.6267297-12199-65641297107767=/root/.ansible/tmp/ansible-tmp-1726882183.6267297-12199-65641297107767 <<< 10896 1726882183.65658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882183.65661: stdout chunk (state=3): >>><<< 10896 1726882183.65664: stderr chunk (state=3): >>><<< 10896 1726882183.65898: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882183.6267297-12199-65641297107767=/root/.ansible/tmp/ansible-tmp-1726882183.6267297-12199-65641297107767 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882183.65902: variable 'ansible_module_compression' from source: unknown 10896 1726882183.65904: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 10896 1726882183.65906: variable 'ansible_facts' from source: unknown 10896 1726882183.65919: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882183.6267297-12199-65641297107767/AnsiballZ_service_facts.py 10896 1726882183.66155: Sending initial data 10896 1726882183.66158: Sent initial data (161 bytes) 10896 1726882183.66704: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882183.66719: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882183.66735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882183.66754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882183.66803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882183.66872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882183.66908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882183.67006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882183.68524: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882183.68608: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10896 1726882183.68690: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmp_oq8o5gg /root/.ansible/tmp/ansible-tmp-1726882183.6267297-12199-65641297107767/AnsiballZ_service_facts.py <<< 10896 1726882183.68695: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882183.6267297-12199-65641297107767/AnsiballZ_service_facts.py" <<< 10896 1726882183.68745: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmp_oq8o5gg" to remote "/root/.ansible/tmp/ansible-tmp-1726882183.6267297-12199-65641297107767/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882183.6267297-12199-65641297107767/AnsiballZ_service_facts.py" <<< 10896 1726882183.69558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882183.69691: stderr chunk (state=3): >>><<< 10896 1726882183.69696: stdout chunk (state=3): >>><<< 10896 1726882183.69707: done transferring module to remote 10896 1726882183.69725: _low_level_execute_command(): starting 10896 1726882183.69734: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882183.6267297-12199-65641297107767/ /root/.ansible/tmp/ansible-tmp-1726882183.6267297-12199-65641297107767/AnsiballZ_service_facts.py && sleep 0' 10896 1726882183.70360: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882183.70379: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882183.70480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882183.70513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882183.70531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882183.70551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882183.70649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882183.72396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882183.72407: stdout chunk (state=3): >>><<< 10896 1726882183.72421: stderr chunk (state=3): >>><<< 10896 1726882183.72513: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882183.72516: _low_level_execute_command(): starting 10896 1726882183.72519: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882183.6267297-12199-65641297107767/AnsiballZ_service_facts.py && sleep 0' 10896 1726882183.73053: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882183.73066: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882183.73078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882183.73097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882183.73115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882183.73125: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882183.73148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882183.73245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882183.73268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882183.73374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882185.23537: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", 
"source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 10896 1726882185.24699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882185.24721: stderr chunk (state=3): >>>Shared connection to 10.31.14.69 closed. <<< 10896 1726882185.24771: stderr chunk (state=3): >>><<< 10896 1726882185.24781: stdout chunk (state=3): >>><<< 10896 1726882185.24816: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": 
{"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": 
"systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 
10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 10896 1726882185.25806: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882183.6267297-12199-65641297107767/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882185.25823: _low_level_execute_command(): starting 10896 1726882185.25834: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882183.6267297-12199-65641297107767/ > /dev/null 2>&1 && sleep 0' 10896 1726882185.26441: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882185.26454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882185.26469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882185.26488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882185.26509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882185.26602: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882185.26625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882185.26641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882185.26889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882185.28578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882185.28582: stdout chunk (state=3): >>><<< 10896 1726882185.28584: stderr chunk (state=3): >>><<< 10896 1726882185.28701: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882185.28705: handler run complete 10896 1726882185.29000: variable 'ansible_facts' from source: unknown 10896 1726882185.29399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882185.30238: variable 'ansible_facts' from source: unknown 10896 1726882185.30501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882185.31003: attempt loop complete, returning result 10896 1726882185.31015: _execute() done 10896 1726882185.31024: dumping result to json 10896 1726882185.31087: done dumping result, returning 10896 1726882185.31299: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-8b02-b216-000000000499] 10896 1726882185.31302: sending task result for task 12673a56-9f93-8b02-b216-000000000499 10896 1726882185.33257: done sending task result for task 12673a56-9f93-8b02-b216-000000000499 10896 1726882185.33262: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10896 1726882185.33329: no more pending results, returning what we have 10896 1726882185.33331: results queue empty 10896 1726882185.33332: checking for any_errors_fatal 10896 1726882185.33335: done checking for any_errors_fatal 10896 1726882185.33335: checking for max_fail_percentage 10896 1726882185.33337: done checking for max_fail_percentage 10896 1726882185.33338: checking to see if all hosts have failed and the running result is not ok 10896 1726882185.33338: done checking to see if all hosts have failed 10896 1726882185.33339: getting the remaining hosts for this loop 10896 1726882185.33340: done getting the remaining hosts for this loop 10896 1726882185.33343: getting the next task for host managed_node2 10896 1726882185.33399: done getting next task for host managed_node2 10896 1726882185.33403: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 10896 1726882185.33410: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10896 1726882185.33420: getting variables 10896 1726882185.33421: in VariableManager get_vars() 10896 1726882185.33450: Calling all_inventory to load vars for managed_node2 10896 1726882185.33453: Calling groups_inventory to load vars for managed_node2 10896 1726882185.33455: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882185.33463: Calling all_plugins_play to load vars for managed_node2 10896 1726882185.33466: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882185.33469: Calling groups_plugins_play to load vars for managed_node2 10896 1726882185.35836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882185.39138: done with get_vars() 10896 1726882185.39173: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:29:45 -0400 (0:00:01.813) 0:00:26.959 ****** 10896 1726882185.39277: entering _queue_task() for managed_node2/package_facts 10896 1726882185.39719: worker is 1 (out of 1 available) 10896 1726882185.39730: exiting _queue_task() for managed_node2/package_facts 10896 1726882185.39741: done queuing things up, now waiting for results queue to drain 10896 1726882185.39742: waiting for pending results... 
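The service_facts result recorded above exposes every unit under ansible_facts.services, keyed by unit name, with state, status, and source fields. A minimal sketch of how such a result is typically gathered and then queried with standard filters follows; the task names and the debug step are illustrative and not part of this run, whose actual role task executes with no_log: true:

- name: Gather service state (sketch of a service_facts call)
  ansible.builtin.service_facts:

- name: Illustrative only - list units reported as running
  ansible.builtin.debug:
    msg: >-
      {{ ansible_facts.services | dict2items
         | selectattr('value.state', 'equalto', 'running')
         | map(attribute='key') | list }}

Against the data above, that expression would pick out entries such as NetworkManager.service, sshd.service, and systemd-journald.service.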
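The log entries that follow repeat the low-level execution pattern already used for service_facts: create a temp directory under ~/.ansible/tmp on the target, sftp the AnsiballZ_package_facts.py payload into it, chmod u+x the files, run the payload with /usr/bin/python3.12, and finally remove the temp directory. The module being dispatched is package_facts; a minimal sketch of such a task and of reading its result is shown here, where the manager argument and the debug step are assumptions for illustration rather than values taken from this log:

- name: Check which packages are installed (sketch of a package_facts call)
  ansible.builtin.package_facts:
    manager: auto   # assumption: let the module pick the rpm backend on this host

- name: Illustrative only - inspect one entry from the gathered facts
  ansible.builtin.debug:
    var: ansible_facts.packages['glibc']

Each key in ansible_facts.packages maps to a list of installed instances carrying name, version, release, epoch, arch, and source, as the module output further below shows for glibc 2.39-17.el10.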
10896 1726882185.40028: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 10896 1726882185.40219: in run() - task 12673a56-9f93-8b02-b216-00000000049a 10896 1726882185.40222: variable 'ansible_search_path' from source: unknown 10896 1726882185.40225: variable 'ansible_search_path' from source: unknown 10896 1726882185.40261: calling self._execute() 10896 1726882185.40400: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882185.40406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882185.40409: variable 'omit' from source: magic vars 10896 1726882185.40928: variable 'ansible_distribution_major_version' from source: facts 10896 1726882185.40947: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882185.40959: variable 'omit' from source: magic vars 10896 1726882185.41051: variable 'omit' from source: magic vars 10896 1726882185.41118: variable 'omit' from source: magic vars 10896 1726882185.41142: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882185.41181: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882185.41207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882185.41298: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882185.41302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882185.41304: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882185.41306: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882185.41308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882185.41395: Set connection var ansible_connection to ssh 10896 1726882185.41409: Set connection var ansible_timeout to 10 10896 1726882185.41416: Set connection var ansible_shell_type to sh 10896 1726882185.41428: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882185.41440: Set connection var ansible_shell_executable to /bin/sh 10896 1726882185.41453: Set connection var ansible_pipelining to False 10896 1726882185.41480: variable 'ansible_shell_executable' from source: unknown 10896 1726882185.41487: variable 'ansible_connection' from source: unknown 10896 1726882185.41495: variable 'ansible_module_compression' from source: unknown 10896 1726882185.41503: variable 'ansible_shell_type' from source: unknown 10896 1726882185.41509: variable 'ansible_shell_executable' from source: unknown 10896 1726882185.41555: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882185.41558: variable 'ansible_pipelining' from source: unknown 10896 1726882185.41560: variable 'ansible_timeout' from source: unknown 10896 1726882185.41562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882185.41735: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10896 1726882185.41753: variable 'omit' from source: magic vars 10896 
1726882185.41761: starting attempt loop 10896 1726882185.41771: running the handler 10896 1726882185.41789: _low_level_execute_command(): starting 10896 1726882185.41880: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882185.42612: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882185.42692: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882185.42721: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882185.42830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882185.42892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882185.44457: stdout chunk (state=3): >>>/root <<< 10896 1726882185.44576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882185.44584: stdout chunk (state=3): >>><<< 10896 1726882185.44610: stderr chunk (state=3): >>><<< 10896 1726882185.44616: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882185.44629: _low_level_execute_command(): starting 10896 1726882185.44635: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882185.446177-12262-208160450795468 `" && echo ansible-tmp-1726882185.446177-12262-208160450795468="` echo 
/root/.ansible/tmp/ansible-tmp-1726882185.446177-12262-208160450795468 `" ) && sleep 0' 10896 1726882185.45065: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882185.45068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882185.45072: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882185.45081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882185.45124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882185.45181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882185.45259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882185.47108: stdout chunk (state=3): >>>ansible-tmp-1726882185.446177-12262-208160450795468=/root/.ansible/tmp/ansible-tmp-1726882185.446177-12262-208160450795468 <<< 10896 1726882185.47273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882185.47277: stdout chunk (state=3): >>><<< 10896 1726882185.47279: stderr chunk (state=3): >>><<< 10896 1726882185.47306: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882185.446177-12262-208160450795468=/root/.ansible/tmp/ansible-tmp-1726882185.446177-12262-208160450795468 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882185.47501: variable 'ansible_module_compression' from source: unknown 10896 1726882185.47504: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 10896 1726882185.47506: variable 'ansible_facts' from source: unknown 10896 1726882185.47697: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882185.446177-12262-208160450795468/AnsiballZ_package_facts.py 10896 1726882185.47846: Sending initial data 10896 1726882185.47959: Sent initial data (161 bytes) 10896 1726882185.48526: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882185.48542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882185.48615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882185.48866: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882185.48887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882185.48907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882185.49154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882185.50605: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882185.50684: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10896 1726882185.50763: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpi6q4spxw /root/.ansible/tmp/ansible-tmp-1726882185.446177-12262-208160450795468/AnsiballZ_package_facts.py <<< 10896 1726882185.50795: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882185.446177-12262-208160450795468/AnsiballZ_package_facts.py" <<< 10896 1726882185.50839: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpi6q4spxw" to remote "/root/.ansible/tmp/ansible-tmp-1726882185.446177-12262-208160450795468/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882185.446177-12262-208160450795468/AnsiballZ_package_facts.py" <<< 10896 1726882185.52431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882185.52490: stderr chunk (state=3): >>><<< 10896 1726882185.52520: stdout chunk (state=3): >>><<< 10896 1726882185.52532: done transferring module to remote 10896 1726882185.52629: _low_level_execute_command(): starting 10896 1726882185.52632: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882185.446177-12262-208160450795468/ /root/.ansible/tmp/ansible-tmp-1726882185.446177-12262-208160450795468/AnsiballZ_package_facts.py && sleep 0' 10896 1726882185.53201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882185.53218: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882185.53244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882185.53262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882185.53278: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882185.53289: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882185.53354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882185.53410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882185.53428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882185.53458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882185.53556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882185.55399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882185.55403: stdout chunk (state=3): >>><<< 10896 1726882185.55405: stderr chunk (state=3): >>><<< 10896 1726882185.55407: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882185.55410: _low_level_execute_command(): starting 10896 1726882185.55412: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882185.446177-12262-208160450795468/AnsiballZ_package_facts.py && sleep 0' 10896 1726882185.56029: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882185.56045: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882185.56058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882185.56086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882185.56203: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882185.56230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882185.56337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882186.00122: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", 
"version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 10896 1726882186.00145: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", 
"release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 10896 1726882186.00195: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", 
"version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 10896 1726882186.00204: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": 
"2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": 
"2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 10896 1726882186.00211: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": 
"0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 10896 1726882186.00242: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": 
[{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 10896 1726882186.00278: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 10896 1726882186.00286: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 10896 1726882186.00296: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 10896 1726882186.00314: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", 
"epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 10896 1726882186.00320: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 10896 1726882186.00358: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 10896 1726882186.02054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 10896 1726882186.02087: stderr chunk (state=3): >>><<< 10896 1726882186.02090: stdout chunk (state=3): >>><<< 10896 1726882186.02139: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 10896 1726882186.03452: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882185.446177-12262-208160450795468/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882186.03468: _low_level_execute_command(): starting 10896 1726882186.03472: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882185.446177-12262-208160450795468/ > /dev/null 2>&1 && sleep 0' 10896 1726882186.03985: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882186.03989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882186.03991: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 10896 1726882186.03996: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882186.03999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882186.04052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882186.04056: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882186.04058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882186.04125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882186.06003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882186.06006: stdout chunk (state=3): >>><<< 10896 1726882186.06008: stderr chunk (state=3): >>><<< 10896 1726882186.06100: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882186.06105: handler run complete 10896 1726882186.06655: variable 'ansible_facts' from source: unknown 10896 1726882186.06969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882186.08006: variable 'ansible_facts' from source: unknown 10896 1726882186.08500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882186.09072: attempt loop complete, returning result 10896 1726882186.09089: _execute() done 10896 1726882186.09103: dumping result to json 10896 1726882186.09314: done dumping result, returning 10896 1726882186.09328: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-8b02-b216-00000000049a] 10896 1726882186.09339: sending task result for task 12673a56-9f93-8b02-b216-00000000049a 10896 1726882186.11471: done sending task result for task 12673a56-9f93-8b02-b216-00000000049a 10896 1726882186.11474: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10896 1726882186.11620: no more pending results, returning what we have 10896 1726882186.11623: results queue empty 10896 1726882186.11624: checking for any_errors_fatal 10896 1726882186.11629: done checking for any_errors_fatal 10896 1726882186.11629: checking for max_fail_percentage 10896 1726882186.11631: done checking for max_fail_percentage 10896 1726882186.11631: checking to see if all hosts have failed and the running result is not ok 10896 1726882186.11632: done checking to see if all hosts have failed 10896 1726882186.11633: getting the remaining hosts for this loop 10896 1726882186.11634: done getting the remaining hosts for this loop 10896 1726882186.11637: getting the next task for host managed_node2 10896 1726882186.11644: done getting next task for host managed_node2 10896 1726882186.11647: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 10896 1726882186.11652: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10896 1726882186.11662: getting variables 10896 1726882186.11663: in VariableManager get_vars() 10896 1726882186.11698: Calling all_inventory to load vars for managed_node2 10896 1726882186.11701: Calling groups_inventory to load vars for managed_node2 10896 1726882186.11703: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882186.11712: Calling all_plugins_play to load vars for managed_node2 10896 1726882186.11714: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882186.11717: Calling groups_plugins_play to load vars for managed_node2 10896 1726882186.12892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882186.14424: done with get_vars() 10896 1726882186.14453: done getting variables 10896 1726882186.14523: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:29:46 -0400 (0:00:00.752) 0:00:27.712 ****** 10896 1726882186.14566: entering _queue_task() for managed_node2/debug 10896 1726882186.14915: worker is 1 (out of 1 available) 10896 1726882186.14929: exiting _queue_task() for managed_node2/debug 10896 1726882186.14943: done queuing things up, now waiting for results queue to drain 10896 1726882186.14944: waiting for pending results... 
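The censored "ok" result above comes from the role's package inventory step: a package_facts run with no_log enabled, which is why the normal task output shows only the "output has been hidden" placeholder even though the raw module JSON is visible earlier in the debug stream. A minimal sketch of such a task is below; it is a reconstruction, not the verbatim role source, and the argument layout is an assumption beyond what the logged module_args (manager: auto) confirm.

```yaml
# Hedged sketch, not the actual fedora.linux_system_roles.network source:
# a package_facts task whose result is hidden via no_log, matching the
# "_ansible_no_log": True invocation and the censored result above.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto          # assumption: defaults, as reported in module_args
  no_log: true             # produces the "output has been hidden" placeholder
```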
10896 1726882186.15323: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 10896 1726882186.15401: in run() - task 12673a56-9f93-8b02-b216-00000000007e 10896 1726882186.15428: variable 'ansible_search_path' from source: unknown 10896 1726882186.15437: variable 'ansible_search_path' from source: unknown 10896 1726882186.15480: calling self._execute() 10896 1726882186.15580: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882186.15601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882186.15619: variable 'omit' from source: magic vars 10896 1726882186.16019: variable 'ansible_distribution_major_version' from source: facts 10896 1726882186.16036: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882186.16072: variable 'omit' from source: magic vars 10896 1726882186.16120: variable 'omit' from source: magic vars 10896 1726882186.16228: variable 'network_provider' from source: set_fact 10896 1726882186.16253: variable 'omit' from source: magic vars 10896 1726882186.16404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882186.16408: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882186.16410: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882186.16412: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882186.16414: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882186.16441: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882186.16447: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882186.16455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882186.16553: Set connection var ansible_connection to ssh 10896 1726882186.16565: Set connection var ansible_timeout to 10 10896 1726882186.16571: Set connection var ansible_shell_type to sh 10896 1726882186.16581: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882186.16589: Set connection var ansible_shell_executable to /bin/sh 10896 1726882186.16602: Set connection var ansible_pipelining to False 10896 1726882186.16627: variable 'ansible_shell_executable' from source: unknown 10896 1726882186.16637: variable 'ansible_connection' from source: unknown 10896 1726882186.16643: variable 'ansible_module_compression' from source: unknown 10896 1726882186.16648: variable 'ansible_shell_type' from source: unknown 10896 1726882186.16653: variable 'ansible_shell_executable' from source: unknown 10896 1726882186.16658: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882186.16664: variable 'ansible_pipelining' from source: unknown 10896 1726882186.16669: variable 'ansible_timeout' from source: unknown 10896 1726882186.16744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882186.16815: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 10896 1726882186.16832: variable 'omit' from source: magic vars 10896 1726882186.16840: starting attempt loop 10896 1726882186.16845: running the handler 10896 1726882186.16897: handler run complete 10896 1726882186.16915: attempt loop complete, returning result 10896 1726882186.16921: _execute() done 10896 1726882186.16961: dumping result to json 10896 1726882186.16964: done dumping result, returning 10896 1726882186.16966: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-8b02-b216-00000000007e] 10896 1726882186.16968: sending task result for task 12673a56-9f93-8b02-b216-00000000007e 10896 1726882186.17112: done sending task result for task 12673a56-9f93-8b02-b216-00000000007e 10896 1726882186.17116: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 10896 1726882186.17180: no more pending results, returning what we have 10896 1726882186.17183: results queue empty 10896 1726882186.17184: checking for any_errors_fatal 10896 1726882186.17398: done checking for any_errors_fatal 10896 1726882186.17400: checking for max_fail_percentage 10896 1726882186.17402: done checking for max_fail_percentage 10896 1726882186.17403: checking to see if all hosts have failed and the running result is not ok 10896 1726882186.17404: done checking to see if all hosts have failed 10896 1726882186.17405: getting the remaining hosts for this loop 10896 1726882186.17406: done getting the remaining hosts for this loop 10896 1726882186.17410: getting the next task for host managed_node2 10896 1726882186.17416: done getting next task for host managed_node2 10896 1726882186.17420: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 10896 1726882186.17423: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10896 1726882186.17434: getting variables 10896 1726882186.17436: in VariableManager get_vars() 10896 1726882186.17471: Calling all_inventory to load vars for managed_node2 10896 1726882186.17474: Calling groups_inventory to load vars for managed_node2 10896 1726882186.17476: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882186.17485: Calling all_plugins_play to load vars for managed_node2 10896 1726882186.17487: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882186.17491: Calling groups_plugins_play to load vars for managed_node2 10896 1726882186.23133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882186.24707: done with get_vars() 10896 1726882186.24731: done getting variables 10896 1726882186.24773: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:29:46 -0400 (0:00:00.102) 0:00:27.815 ****** 10896 1726882186.24806: entering _queue_task() for managed_node2/fail 10896 1726882186.25154: worker is 1 (out of 1 available) 10896 1726882186.25168: exiting _queue_task() for managed_node2/fail 10896 1726882186.25181: done queuing things up, now waiting for results queue to drain 10896 1726882186.25183: waiting for pending results... 
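The "Using network provider: nm" message above is emitted by a plain debug task gated on the EL6 exclusion that the log shows evaluating to True. A plausible reconstruction follows; the exact msg template in the role is an assumption, but the task name, the debug action, and the network_provider fact coming from set_fact are all confirmed by the trace.

```yaml
# Hedged sketch of the "Print network provider" step traced above.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"   # assumed template
  when: ansible_distribution_major_version != '6'
```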
10896 1726882186.25614: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 10896 1726882186.25672: in run() - task 12673a56-9f93-8b02-b216-00000000007f 10896 1726882186.25697: variable 'ansible_search_path' from source: unknown 10896 1726882186.25711: variable 'ansible_search_path' from source: unknown 10896 1726882186.25754: calling self._execute() 10896 1726882186.25866: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882186.25879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882186.25898: variable 'omit' from source: magic vars 10896 1726882186.26298: variable 'ansible_distribution_major_version' from source: facts 10896 1726882186.26316: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882186.26469: variable 'network_state' from source: role '' defaults 10896 1726882186.26473: Evaluated conditional (network_state != {}): False 10896 1726882186.26476: when evaluation is False, skipping this task 10896 1726882186.26479: _execute() done 10896 1726882186.26481: dumping result to json 10896 1726882186.26483: done dumping result, returning 10896 1726882186.26486: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-8b02-b216-00000000007f] 10896 1726882186.26489: sending task result for task 12673a56-9f93-8b02-b216-00000000007f skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10896 1726882186.26647: no more pending results, returning what we have 10896 1726882186.26651: results queue empty 10896 1726882186.26651: checking for any_errors_fatal 10896 1726882186.26662: done checking for any_errors_fatal 10896 1726882186.26662: checking for max_fail_percentage 10896 1726882186.26664: done checking for max_fail_percentage 10896 1726882186.26665: checking to see if all hosts have failed and the running result is not ok 10896 1726882186.26666: done checking to see if all hosts have failed 10896 1726882186.26667: getting the remaining hosts for this loop 10896 1726882186.26668: done getting the remaining hosts for this loop 10896 1726882186.26671: getting the next task for host managed_node2 10896 1726882186.26678: done getting next task for host managed_node2 10896 1726882186.26682: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10896 1726882186.26685: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10896 1726882186.26710: getting variables 10896 1726882186.26711: in VariableManager get_vars() 10896 1726882186.26751: Calling all_inventory to load vars for managed_node2 10896 1726882186.26754: Calling groups_inventory to load vars for managed_node2 10896 1726882186.26757: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882186.26767: Calling all_plugins_play to load vars for managed_node2 10896 1726882186.26770: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882186.26772: Calling groups_plugins_play to load vars for managed_node2 10896 1726882186.27701: done sending task result for task 12673a56-9f93-8b02-b216-00000000007f 10896 1726882186.27705: WORKER PROCESS EXITING 10896 1726882186.28356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882186.29982: done with get_vars() 10896 1726882186.30010: done getting variables 10896 1726882186.30069: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:29:46 -0400 (0:00:00.052) 0:00:27.868 ****** 10896 1726882186.30110: entering _queue_task() for managed_node2/fail 10896 1726882186.30437: worker is 1 (out of 1 available) 10896 1726882186.30451: exiting _queue_task() for managed_node2/fail 10896 1726882186.30465: done queuing things up, now waiting for results queue to drain 10896 1726882186.30466: waiting for pending results... 
10896 1726882186.30787: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10896 1726882186.30963: in run() - task 12673a56-9f93-8b02-b216-000000000080 10896 1726882186.30984: variable 'ansible_search_path' from source: unknown 10896 1726882186.30998: variable 'ansible_search_path' from source: unknown 10896 1726882186.31045: calling self._execute() 10896 1726882186.31154: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882186.31167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882186.31183: variable 'omit' from source: magic vars 10896 1726882186.31583: variable 'ansible_distribution_major_version' from source: facts 10896 1726882186.31605: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882186.31729: variable 'network_state' from source: role '' defaults 10896 1726882186.31746: Evaluated conditional (network_state != {}): False 10896 1726882186.31900: when evaluation is False, skipping this task 10896 1726882186.31904: _execute() done 10896 1726882186.31906: dumping result to json 10896 1726882186.31909: done dumping result, returning 10896 1726882186.31912: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-8b02-b216-000000000080] 10896 1726882186.31915: sending task result for task 12673a56-9f93-8b02-b216-000000000080 10896 1726882186.31991: done sending task result for task 12673a56-9f93-8b02-b216-000000000080 10896 1726882186.31998: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10896 1726882186.32050: no more pending results, returning what we have 10896 1726882186.32054: results queue empty 10896 1726882186.32055: checking for any_errors_fatal 10896 1726882186.32061: done checking for any_errors_fatal 10896 1726882186.32062: checking for max_fail_percentage 10896 1726882186.32064: done checking for max_fail_percentage 10896 1726882186.32066: checking to see if all hosts have failed and the running result is not ok 10896 1726882186.32066: done checking to see if all hosts have failed 10896 1726882186.32067: getting the remaining hosts for this loop 10896 1726882186.32069: done getting the remaining hosts for this loop 10896 1726882186.32073: getting the next task for host managed_node2 10896 1726882186.32081: done getting next task for host managed_node2 10896 1726882186.32085: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10896 1726882186.32090: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10896 1726882186.32113: getting variables 10896 1726882186.32115: in VariableManager get_vars() 10896 1726882186.32154: Calling all_inventory to load vars for managed_node2 10896 1726882186.32156: Calling groups_inventory to load vars for managed_node2 10896 1726882186.32158: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882186.32169: Calling all_plugins_play to load vars for managed_node2 10896 1726882186.32172: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882186.32174: Calling groups_plugins_play to load vars for managed_node2 10896 1726882186.33788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882186.35377: done with get_vars() 10896 1726882186.35401: done getting variables 10896 1726882186.35458: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:29:46 -0400 (0:00:00.053) 0:00:27.921 ****** 10896 1726882186.35488: entering _queue_task() for managed_node2/fail 10896 1726882186.35767: worker is 1 (out of 1 available) 10896 1726882186.35781: exiting _queue_task() for managed_node2/fail 10896 1726882186.35899: done queuing things up, now waiting for results queue to drain 10896 1726882186.35901: waiting for pending results... 
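Both "Abort applying the network state configuration ..." guards above are skipped for the same reason: network_state resolves to the role default (an empty dict), so the first clause of their when list is already False. A hedged reconstruction of the pair follows; only the "network_state != {}" clause is confirmed by the log (it is the recorded false_condition), so the fail messages and any additional when clauses are assumptions added for illustration.

```yaml
# Hedged sketch of the two skipped fail guards; only "network_state != {}"
# is confirmed by the skip results above, the rest is assumed.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider  # assumed wording
  when:
    - network_state != {}                # logged as the false condition
    - network_provider == "initscripts"  # assumption

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying network_state requires a managed host of version 8 or later  # assumed wording
  when:
    - network_state != {}                            # logged as the false condition
    - ansible_distribution_major_version | int < 8   # assumption
```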
10896 1726882186.36211: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10896 1726882186.36277: in run() - task 12673a56-9f93-8b02-b216-000000000081 10896 1726882186.36300: variable 'ansible_search_path' from source: unknown 10896 1726882186.36310: variable 'ansible_search_path' from source: unknown 10896 1726882186.36353: calling self._execute() 10896 1726882186.36460: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882186.36472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882186.36487: variable 'omit' from source: magic vars 10896 1726882186.37002: variable 'ansible_distribution_major_version' from source: facts 10896 1726882186.37006: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882186.37057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882186.39306: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882186.39382: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882186.39431: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882186.39469: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882186.39508: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882186.39586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.39630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882186.39662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.39716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.39743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.39848: variable 'ansible_distribution_major_version' from source: facts 10896 1726882186.39869: Evaluated conditional (ansible_distribution_major_version | int > 9): True 10896 1726882186.39990: variable 'ansible_distribution' from source: facts 10896 1726882186.40005: variable '__network_rh_distros' from source: role '' defaults 10896 1726882186.40022: Evaluated conditional (ansible_distribution in __network_rh_distros): True 10896 1726882186.40283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.40317: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882186.40346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.40398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.40487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.40490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.40502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882186.40532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.40575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.40603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.40648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.40677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882186.40714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.40756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.40776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.41103: variable 'network_connections' from source: task vars 10896 1726882186.41135: variable 'port2_profile' from source: play vars 10896 1726882186.41188: variable 'port2_profile' from source: play vars 10896 1726882186.41244: variable 'port1_profile' from source: play vars 10896 1726882186.41273: variable 'port1_profile' from source: play vars 10896 1726882186.41288: variable 'controller_profile' from source: play vars 
10896 1726882186.41355: variable 'controller_profile' from source: play vars 10896 1726882186.41369: variable 'network_state' from source: role '' defaults 10896 1726882186.41440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882186.41679: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882186.41682: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882186.41708: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882186.41741: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882186.41803: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882186.41829: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 1726882186.41858: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.41888: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882186.42001: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 10896 1726882186.42005: when evaluation is False, skipping this task 10896 1726882186.42007: _execute() done 10896 1726882186.42010: dumping result to json 10896 1726882186.42012: done dumping result, returning 10896 1726882186.42015: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-8b02-b216-000000000081] 10896 1726882186.42018: sending task result for task 12673a56-9f93-8b02-b216-000000000081 10896 1726882186.42086: done sending task result for task 12673a56-9f93-8b02-b216-000000000081 10896 1726882186.42090: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 10896 1726882186.42140: no more pending results, returning what we have 10896 1726882186.42143: results queue empty 10896 1726882186.42144: checking for any_errors_fatal 10896 1726882186.42152: done checking for any_errors_fatal 10896 1726882186.42152: checking for max_fail_percentage 10896 1726882186.42154: done checking for max_fail_percentage 10896 1726882186.42155: checking to see if all hosts have failed and the running result is not ok 10896 1726882186.42156: done checking to see if all hosts have failed 10896 
1726882186.42157: getting the remaining hosts for this loop 10896 1726882186.42159: done getting the remaining hosts for this loop 10896 1726882186.42162: getting the next task for host managed_node2 10896 1726882186.42170: done getting next task for host managed_node2 10896 1726882186.42174: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 10896 1726882186.42178: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10896 1726882186.42200: getting variables 10896 1726882186.42202: in VariableManager get_vars() 10896 1726882186.42243: Calling all_inventory to load vars for managed_node2 10896 1726882186.42246: Calling groups_inventory to load vars for managed_node2 10896 1726882186.42248: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882186.42258: Calling all_plugins_play to load vars for managed_node2 10896 1726882186.42262: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882186.42264: Calling groups_plugins_play to load vars for managed_node2 10896 1726882186.43839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882186.45437: done with get_vars() 10896 1726882186.45459: done getting variables 10896 1726882186.45521: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:29:46 -0400 (0:00:00.100) 0:00:28.022 ****** 10896 1726882186.45554: entering _queue_task() for managed_node2/dnf 10896 1726882186.45862: worker is 1 (out of 1 available) 10896 1726882186.45872: exiting _queue_task() for managed_node2/dnf 10896 1726882186.45885: done queuing things up, now waiting for results queue to drain 10896 1726882186.45886: waiting for pending results... 
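The teaming guard above passes its version and distribution checks (both evaluate True on this EL10 host) but is skipped because no connection profile and no network_state interface has type "team". The selectattr chain in the skip output maps onto a task roughly like the sketch below; the when expression is taken from the logged false_condition, while the fail message is an assumption.

```yaml
# Hedged sketch of the skipped teaming guard; the when clauses mirror the
# conditionals evaluated above, the msg wording is assumed.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later  # assumed wording
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    - >-
      network_connections | selectattr("type", "defined") |
      selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", []) | selectattr("type", "defined") |
      selectattr("type", "match", "^team$") | list | length > 0
```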
10896 1726882186.46313: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 10896 1726882186.46336: in run() - task 12673a56-9f93-8b02-b216-000000000082 10896 1726882186.46354: variable 'ansible_search_path' from source: unknown 10896 1726882186.46363: variable 'ansible_search_path' from source: unknown 10896 1726882186.46407: calling self._execute() 10896 1726882186.46504: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882186.46522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882186.46631: variable 'omit' from source: magic vars 10896 1726882186.46905: variable 'ansible_distribution_major_version' from source: facts 10896 1726882186.46921: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882186.47119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882186.49540: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882186.49591: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882186.49622: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882186.49648: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882186.49668: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882186.49728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.49749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882186.49769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.49799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.49811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.49889: variable 'ansible_distribution' from source: facts 10896 1726882186.49894: variable 'ansible_distribution_major_version' from source: facts 10896 1726882186.49908: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 10896 1726882186.49982: variable '__network_wireless_connections_defined' from source: role '' defaults 10896 1726882186.50067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.50083: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882186.50107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.50133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.50143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.50172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.50188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882186.50210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.50235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.50245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.50271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.50291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882186.50313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.50338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.50348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.50447: variable 'network_connections' from source: task vars 10896 1726882186.50457: variable 'port2_profile' from source: play vars 10896 1726882186.50504: variable 'port2_profile' from source: play vars 10896 1726882186.50513: variable 'port1_profile' from source: play vars 10896 1726882186.50555: variable 'port1_profile' from source: play vars 10896 1726882186.50562: variable 'controller_profile' from source: play vars 
10896 1726882186.50609: variable 'controller_profile' from source: play vars 10896 1726882186.50655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882186.50765: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882186.50805: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882186.50829: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882186.50851: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882186.50881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882186.50903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 1726882186.50921: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.50940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882186.50990: variable '__network_team_connections_defined' from source: role '' defaults 10896 1726882186.51279: variable 'network_connections' from source: task vars 10896 1726882186.51282: variable 'port2_profile' from source: play vars 10896 1726882186.51284: variable 'port2_profile' from source: play vars 10896 1726882186.51309: variable 'port1_profile' from source: play vars 10896 1726882186.51379: variable 'port1_profile' from source: play vars 10896 1726882186.51396: variable 'controller_profile' from source: play vars 10896 1726882186.51458: variable 'controller_profile' from source: play vars 10896 1726882186.51492: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10896 1726882186.51503: when evaluation is False, skipping this task 10896 1726882186.51516: _execute() done 10896 1726882186.51622: dumping result to json 10896 1726882186.51625: done dumping result, returning 10896 1726882186.51627: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-8b02-b216-000000000082] 10896 1726882186.51629: sending task result for task 12673a56-9f93-8b02-b216-000000000082 10896 1726882186.51700: done sending task result for task 12673a56-9f93-8b02-b216-000000000082 10896 1726882186.51704: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10896 1726882186.51760: no more pending results, returning what we have 10896 1726882186.51763: results queue empty 10896 1726882186.51764: checking for any_errors_fatal 10896 1726882186.51770: done checking for any_errors_fatal 10896 1726882186.51771: checking for max_fail_percentage 10896 1726882186.51773: done checking 
for max_fail_percentage 10896 1726882186.51774: checking to see if all hosts have failed and the running result is not ok 10896 1726882186.51775: done checking to see if all hosts have failed 10896 1726882186.51775: getting the remaining hosts for this loop 10896 1726882186.51777: done getting the remaining hosts for this loop 10896 1726882186.51781: getting the next task for host managed_node2 10896 1726882186.51789: done getting next task for host managed_node2 10896 1726882186.51842: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 10896 1726882186.51848: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10896 1726882186.51867: getting variables 10896 1726882186.51869: in VariableManager get_vars() 10896 1726882186.51913: Calling all_inventory to load vars for managed_node2 10896 1726882186.51916: Calling groups_inventory to load vars for managed_node2 10896 1726882186.51918: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882186.51927: Calling all_plugins_play to load vars for managed_node2 10896 1726882186.51931: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882186.51933: Calling groups_plugins_play to load vars for managed_node2 10896 1726882186.52887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882186.53753: done with get_vars() 10896 1726882186.53768: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 10896 1726882186.53823: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:29:46 -0400 (0:00:00.082) 0:00:28.105 ****** 10896 1726882186.53847: entering _queue_task() for managed_node2/yum 10896 1726882186.54085: worker is 1 (out of 1 available) 10896 1726882186.54103: exiting _queue_task() for managed_node2/yum 10896 1726882186.54115: done queuing things up, now waiting for results queue to drain 10896 1726882186.54116: waiting for pending results... 
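The task queued above is the YUM-based counterpart of the DNF check that was just skipped, and the log already records two relevant facts: the ansible.builtin.yum action is redirected to ansible.builtin.dnf on this controller, and the task is gated on ansible_distribution_major_version | int < 8, so it can only run on EL7-era hosts where yum is the native package manager. A minimal sketch of a task shaped like this one (the list/register arguments are assumptions for illustration, not the role's actual source):

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        list: updates                    # assumption: only query available updates
      register: __network_yum_updates    # hypothetical register name
      when:
        - ansible_distribution_major_version | int < 8
        - __network_wireless_connections_defined or __network_team_connections_defined   # gate implied by the task name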
10896 1726882186.54541: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 10896 1726882186.54545: in run() - task 12673a56-9f93-8b02-b216-000000000083 10896 1726882186.54548: variable 'ansible_search_path' from source: unknown 10896 1726882186.54550: variable 'ansible_search_path' from source: unknown 10896 1726882186.54586: calling self._execute() 10896 1726882186.54685: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882186.54701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882186.54716: variable 'omit' from source: magic vars 10896 1726882186.55098: variable 'ansible_distribution_major_version' from source: facts 10896 1726882186.55116: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882186.55295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882186.57595: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882186.57684: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882186.57727: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882186.57765: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882186.57803: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882186.57899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.57923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882186.58000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.58004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.58016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.58125: variable 'ansible_distribution_major_version' from source: facts 10896 1726882186.58145: Evaluated conditional (ansible_distribution_major_version | int < 8): False 10896 1726882186.58153: when evaluation is False, skipping this task 10896 1726882186.58160: _execute() done 10896 1726882186.58216: dumping result to json 10896 1726882186.58224: done dumping result, returning 10896 1726882186.58227: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-8b02-b216-000000000083] 10896 
1726882186.58229: sending task result for task 12673a56-9f93-8b02-b216-000000000083 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 10896 1726882186.58362: no more pending results, returning what we have 10896 1726882186.58364: results queue empty 10896 1726882186.58365: checking for any_errors_fatal 10896 1726882186.58370: done checking for any_errors_fatal 10896 1726882186.58370: checking for max_fail_percentage 10896 1726882186.58372: done checking for max_fail_percentage 10896 1726882186.58373: checking to see if all hosts have failed and the running result is not ok 10896 1726882186.58374: done checking to see if all hosts have failed 10896 1726882186.58374: getting the remaining hosts for this loop 10896 1726882186.58376: done getting the remaining hosts for this loop 10896 1726882186.58379: getting the next task for host managed_node2 10896 1726882186.58386: done getting next task for host managed_node2 10896 1726882186.58389: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 10896 1726882186.58397: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10896 1726882186.58423: getting variables 10896 1726882186.58424: in VariableManager get_vars() 10896 1726882186.58463: Calling all_inventory to load vars for managed_node2 10896 1726882186.58466: Calling groups_inventory to load vars for managed_node2 10896 1726882186.58468: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882186.58478: Calling all_plugins_play to load vars for managed_node2 10896 1726882186.58481: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882186.58483: Calling groups_plugins_play to load vars for managed_node2 10896 1726882186.59065: done sending task result for task 12673a56-9f93-8b02-b216-000000000083 10896 1726882186.59068: WORKER PROCESS EXITING 10896 1726882186.60075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882186.63382: done with get_vars() 10896 1726882186.63407: done getting variables 10896 1726882186.63473: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:29:46 -0400 (0:00:00.096) 0:00:28.202 ****** 10896 1726882186.63513: entering _queue_task() for managed_node2/fail 10896 1726882186.64249: worker is 1 (out of 1 available) 10896 1726882186.64262: exiting _queue_task() for managed_node2/fail 10896 1726882186.64273: done queuing things up, now waiting for results queue to drain 10896 1726882186.64274: waiting for pending results... 
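The "Ask user's consent to restart NetworkManager" task loads the fail action plugin, so when its conditions hold it aborts the play rather than restarting anything silently; in the run below it is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined is true. A hedged sketch of a consent gate built on the same conditional (the network_allow_restart variable and the message wording are invented for illustration):

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: >-
          Managing wireless or team connections requires restarting NetworkManager.
          Set network_allow_restart=true to acknowledge this.
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined
        - not network_allow_restart | d(false)   # hypothetical opt-in variable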
10896 1726882186.64891: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 10896 1726882186.65302: in run() - task 12673a56-9f93-8b02-b216-000000000084 10896 1726882186.65306: variable 'ansible_search_path' from source: unknown 10896 1726882186.65309: variable 'ansible_search_path' from source: unknown 10896 1726882186.65312: calling self._execute() 10896 1726882186.65412: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882186.65417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882186.65420: variable 'omit' from source: magic vars 10896 1726882186.66280: variable 'ansible_distribution_major_version' from source: facts 10896 1726882186.66284: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882186.66564: variable '__network_wireless_connections_defined' from source: role '' defaults 10896 1726882186.66854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882186.69887: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882186.69962: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882186.70040: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882186.70044: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882186.70054: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882186.70135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.70165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882186.70201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.70250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.70500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.70504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.70507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882186.70509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.70512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.70515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.70517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.70519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882186.70535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.70578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.70603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.70775: variable 'network_connections' from source: task vars 10896 1726882186.70791: variable 'port2_profile' from source: play vars 10896 1726882186.70862: variable 'port2_profile' from source: play vars 10896 1726882186.70878: variable 'port1_profile' from source: play vars 10896 1726882186.70942: variable 'port1_profile' from source: play vars 10896 1726882186.70956: variable 'controller_profile' from source: play vars 10896 1726882186.71018: variable 'controller_profile' from source: play vars 10896 1726882186.71092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882186.71272: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882186.71320: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882186.71354: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882186.71386: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882186.71434: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882186.71462: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 1726882186.71491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.71526: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882186.71580: variable '__network_team_connections_defined' from source: role '' defaults 10896 1726882186.71820: variable 'network_connections' from source: task vars 10896 1726882186.71830: variable 'port2_profile' from source: play vars 10896 1726882186.71896: variable 'port2_profile' from source: play vars 10896 1726882186.71910: variable 'port1_profile' from source: play vars 10896 1726882186.71970: variable 'port1_profile' from source: play vars 10896 1726882186.71983: variable 'controller_profile' from source: play vars 10896 1726882186.72047: variable 'controller_profile' from source: play vars 10896 1726882186.72074: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10896 1726882186.72091: when evaluation is False, skipping this task 10896 1726882186.72102: _execute() done 10896 1726882186.72110: dumping result to json 10896 1726882186.72117: done dumping result, returning 10896 1726882186.72127: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-8b02-b216-000000000084] 10896 1726882186.72136: sending task result for task 12673a56-9f93-8b02-b216-000000000084 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10896 1726882186.72281: no more pending results, returning what we have 10896 1726882186.72283: results queue empty 10896 1726882186.72284: checking for any_errors_fatal 10896 1726882186.72290: done checking for any_errors_fatal 10896 1726882186.72290: checking for max_fail_percentage 10896 1726882186.72292: done checking for max_fail_percentage 10896 1726882186.72295: checking to see if all hosts have failed and the running result is not ok 10896 1726882186.72296: done checking to see if all hosts have failed 10896 1726882186.72296: getting the remaining hosts for this loop 10896 1726882186.72298: done getting the remaining hosts for this loop 10896 1726882186.72301: getting the next task for host managed_node2 10896 1726882186.72308: done getting next task for host managed_node2 10896 1726882186.72312: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 10896 1726882186.72316: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10896 1726882186.72513: getting variables 10896 1726882186.72515: in VariableManager get_vars() 10896 1726882186.72550: Calling all_inventory to load vars for managed_node2 10896 1726882186.72553: Calling groups_inventory to load vars for managed_node2 10896 1726882186.72555: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882186.72564: Calling all_plugins_play to load vars for managed_node2 10896 1726882186.72566: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882186.72569: Calling groups_plugins_play to load vars for managed_node2 10896 1726882186.73088: done sending task result for task 12673a56-9f93-8b02-b216-000000000084 10896 1726882186.73091: WORKER PROCESS EXITING 10896 1726882186.73891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882186.75465: done with get_vars() 10896 1726882186.75490: done getting variables 10896 1726882186.75552: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:29:46 -0400 (0:00:00.120) 0:00:28.322 ****** 10896 1726882186.75588: entering _queue_task() for managed_node2/package 10896 1726882186.76029: worker is 1 (out of 1 available) 10896 1726882186.76041: exiting _queue_task() for managed_node2/package 10896 1726882186.76049: done queuing things up, now waiting for results queue to drain 10896 1726882186.76051: waiting for pending results... 
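The "Install packages" task runs through the generic package action, and the execution below skips it because not network_packages is subset(ansible_facts.packages.keys()) evaluates to False, i.e. everything listed in network_packages is already present in the package facts. A minimal sketch of that pattern, using only names that appear in this log:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())

Testing the package list against ansible_facts.packages (typically collected earlier with package_facts) is what lets the role avoid calling the package manager at all when nothing is missing, which is why this step costs almost nothing in the run below.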
10896 1726882186.76411: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 10896 1726882186.76416: in run() - task 12673a56-9f93-8b02-b216-000000000085 10896 1726882186.76419: variable 'ansible_search_path' from source: unknown 10896 1726882186.76422: variable 'ansible_search_path' from source: unknown 10896 1726882186.76455: calling self._execute() 10896 1726882186.76552: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882186.76563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882186.76576: variable 'omit' from source: magic vars 10896 1726882186.76949: variable 'ansible_distribution_major_version' from source: facts 10896 1726882186.76969: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882186.77162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882186.77434: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882186.77485: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882186.77531: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882186.77607: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882186.77722: variable 'network_packages' from source: role '' defaults 10896 1726882186.77833: variable '__network_provider_setup' from source: role '' defaults 10896 1726882186.77853: variable '__network_service_name_default_nm' from source: role '' defaults 10896 1726882186.77919: variable '__network_service_name_default_nm' from source: role '' defaults 10896 1726882186.77951: variable '__network_packages_default_nm' from source: role '' defaults 10896 1726882186.78000: variable '__network_packages_default_nm' from source: role '' defaults 10896 1726882186.78196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882186.80127: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882186.80172: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882186.80197: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882186.80226: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882186.80244: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882186.80309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.80331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882186.80349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.80375: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.80388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.80424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.80441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882186.80458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.80483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.80499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.80641: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10896 1726882186.80728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.80744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882186.80761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.80786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.80800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.80876: variable 'ansible_python' from source: facts 10896 1726882186.80903: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 10896 1726882186.80997: variable '__network_wpa_supplicant_required' from source: role '' defaults 10896 1726882186.81074: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10896 1726882186.81404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.81407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 10896 1726882186.81410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.81412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.81415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.81417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.81427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882186.81429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.81432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.81434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.81549: variable 'network_connections' from source: task vars 10896 1726882186.81555: variable 'port2_profile' from source: play vars 10896 1726882186.81660: variable 'port2_profile' from source: play vars 10896 1726882186.81672: variable 'port1_profile' from source: play vars 10896 1726882186.81774: variable 'port1_profile' from source: play vars 10896 1726882186.81784: variable 'controller_profile' from source: play vars 10896 1726882186.81877: variable 'controller_profile' from source: play vars 10896 1726882186.81951: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882186.81984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 1726882186.82011: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.82033: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882186.82082: variable '__network_wireless_connections_defined' from source: role '' defaults 10896 1726882186.82400: variable 'network_connections' from source: task vars 10896 1726882186.82404: variable 'port2_profile' from source: play vars 10896 1726882186.82437: variable 'port2_profile' from source: play vars 10896 
1726882186.82447: variable 'port1_profile' from source: play vars 10896 1726882186.82543: variable 'port1_profile' from source: play vars 10896 1726882186.82553: variable 'controller_profile' from source: play vars 10896 1726882186.82652: variable 'controller_profile' from source: play vars 10896 1726882186.82682: variable '__network_packages_default_wireless' from source: role '' defaults 10896 1726882186.82799: variable '__network_wireless_connections_defined' from source: role '' defaults 10896 1726882186.83046: variable 'network_connections' from source: task vars 10896 1726882186.83049: variable 'port2_profile' from source: play vars 10896 1726882186.83166: variable 'port2_profile' from source: play vars 10896 1726882186.83169: variable 'port1_profile' from source: play vars 10896 1726882186.83176: variable 'port1_profile' from source: play vars 10896 1726882186.83184: variable 'controller_profile' from source: play vars 10896 1726882186.83256: variable 'controller_profile' from source: play vars 10896 1726882186.83270: variable '__network_packages_default_team' from source: role '' defaults 10896 1726882186.83353: variable '__network_team_connections_defined' from source: role '' defaults 10896 1726882186.83554: variable 'network_connections' from source: task vars 10896 1726882186.83557: variable 'port2_profile' from source: play vars 10896 1726882186.83607: variable 'port2_profile' from source: play vars 10896 1726882186.83614: variable 'port1_profile' from source: play vars 10896 1726882186.83658: variable 'port1_profile' from source: play vars 10896 1726882186.83664: variable 'controller_profile' from source: play vars 10896 1726882186.83716: variable 'controller_profile' from source: play vars 10896 1726882186.83750: variable '__network_service_name_default_initscripts' from source: role '' defaults 10896 1726882186.83792: variable '__network_service_name_default_initscripts' from source: role '' defaults 10896 1726882186.83804: variable '__network_packages_default_initscripts' from source: role '' defaults 10896 1726882186.83844: variable '__network_packages_default_initscripts' from source: role '' defaults 10896 1726882186.83977: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 10896 1726882186.84276: variable 'network_connections' from source: task vars 10896 1726882186.84279: variable 'port2_profile' from source: play vars 10896 1726882186.84325: variable 'port2_profile' from source: play vars 10896 1726882186.84331: variable 'port1_profile' from source: play vars 10896 1726882186.84374: variable 'port1_profile' from source: play vars 10896 1726882186.84381: variable 'controller_profile' from source: play vars 10896 1726882186.84425: variable 'controller_profile' from source: play vars 10896 1726882186.84432: variable 'ansible_distribution' from source: facts 10896 1726882186.84437: variable '__network_rh_distros' from source: role '' defaults 10896 1726882186.84442: variable 'ansible_distribution_major_version' from source: facts 10896 1726882186.84455: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 10896 1726882186.84560: variable 'ansible_distribution' from source: facts 10896 1726882186.84564: variable '__network_rh_distros' from source: role '' defaults 10896 1726882186.84566: variable 'ansible_distribution_major_version' from source: facts 10896 1726882186.84581: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 10896 1726882186.84687: 
variable 'ansible_distribution' from source: facts 10896 1726882186.84694: variable '__network_rh_distros' from source: role '' defaults 10896 1726882186.84697: variable 'ansible_distribution_major_version' from source: facts 10896 1726882186.84720: variable 'network_provider' from source: set_fact 10896 1726882186.84731: variable 'ansible_facts' from source: unknown 10896 1726882186.85152: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 10896 1726882186.85156: when evaluation is False, skipping this task 10896 1726882186.85158: _execute() done 10896 1726882186.85161: dumping result to json 10896 1726882186.85163: done dumping result, returning 10896 1726882186.85165: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-8b02-b216-000000000085] 10896 1726882186.85168: sending task result for task 12673a56-9f93-8b02-b216-000000000085 10896 1726882186.85326: done sending task result for task 12673a56-9f93-8b02-b216-000000000085 10896 1726882186.85329: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 10896 1726882186.85379: no more pending results, returning what we have 10896 1726882186.85382: results queue empty 10896 1726882186.85383: checking for any_errors_fatal 10896 1726882186.85389: done checking for any_errors_fatal 10896 1726882186.85389: checking for max_fail_percentage 10896 1726882186.85391: done checking for max_fail_percentage 10896 1726882186.85392: checking to see if all hosts have failed and the running result is not ok 10896 1726882186.85395: done checking to see if all hosts have failed 10896 1726882186.85395: getting the remaining hosts for this loop 10896 1726882186.85397: done getting the remaining hosts for this loop 10896 1726882186.85406: getting the next task for host managed_node2 10896 1726882186.85413: done getting next task for host managed_node2 10896 1726882186.85417: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 10896 1726882186.85421: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10896 1726882186.85441: getting variables 10896 1726882186.85442: in VariableManager get_vars() 10896 1726882186.85486: Calling all_inventory to load vars for managed_node2 10896 1726882186.85489: Calling groups_inventory to load vars for managed_node2 10896 1726882186.85492: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882186.85524: Calling all_plugins_play to load vars for managed_node2 10896 1726882186.85528: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882186.85531: Calling groups_plugins_play to load vars for managed_node2 10896 1726882186.86764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882186.87617: done with get_vars() 10896 1726882186.87633: done getting variables 10896 1726882186.87673: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:29:46 -0400 (0:00:00.121) 0:00:28.443 ****** 10896 1726882186.87702: entering _queue_task() for managed_node2/package 10896 1726882186.87920: worker is 1 (out of 1 available) 10896 1726882186.87933: exiting _queue_task() for managed_node2/package 10896 1726882186.87946: done queuing things up, now waiting for results queue to drain 10896 1726882186.87947: waiting for pending results... 
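The next two tasks only matter when the role is driven through the declarative network_state variable; both are skipped in this run because network_state keeps its default of {}. A sketch of the first one, with the package list taken from the task title and the rest assumed:

    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager
          - nmstate
        state: present
      when: network_state != {}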
10896 1726882186.88122: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 10896 1726882186.88222: in run() - task 12673a56-9f93-8b02-b216-000000000086 10896 1726882186.88234: variable 'ansible_search_path' from source: unknown 10896 1726882186.88237: variable 'ansible_search_path' from source: unknown 10896 1726882186.88264: calling self._execute() 10896 1726882186.88337: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882186.88341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882186.88350: variable 'omit' from source: magic vars 10896 1726882186.88615: variable 'ansible_distribution_major_version' from source: facts 10896 1726882186.88624: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882186.88704: variable 'network_state' from source: role '' defaults 10896 1726882186.88715: Evaluated conditional (network_state != {}): False 10896 1726882186.88718: when evaluation is False, skipping this task 10896 1726882186.88721: _execute() done 10896 1726882186.88723: dumping result to json 10896 1726882186.88727: done dumping result, returning 10896 1726882186.88737: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-8b02-b216-000000000086] 10896 1726882186.88740: sending task result for task 12673a56-9f93-8b02-b216-000000000086 10896 1726882186.88827: done sending task result for task 12673a56-9f93-8b02-b216-000000000086 10896 1726882186.88829: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10896 1726882186.88880: no more pending results, returning what we have 10896 1726882186.88883: results queue empty 10896 1726882186.88884: checking for any_errors_fatal 10896 1726882186.88889: done checking for any_errors_fatal 10896 1726882186.88890: checking for max_fail_percentage 10896 1726882186.88891: done checking for max_fail_percentage 10896 1726882186.88892: checking to see if all hosts have failed and the running result is not ok 10896 1726882186.88896: done checking to see if all hosts have failed 10896 1726882186.88897: getting the remaining hosts for this loop 10896 1726882186.88899: done getting the remaining hosts for this loop 10896 1726882186.88902: getting the next task for host managed_node2 10896 1726882186.88908: done getting next task for host managed_node2 10896 1726882186.88912: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 10896 1726882186.88916: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10896 1726882186.88932: getting variables 10896 1726882186.88933: in VariableManager get_vars() 10896 1726882186.88966: Calling all_inventory to load vars for managed_node2 10896 1726882186.88968: Calling groups_inventory to load vars for managed_node2 10896 1726882186.88970: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882186.88978: Calling all_plugins_play to load vars for managed_node2 10896 1726882186.88980: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882186.88983: Calling groups_plugins_play to load vars for managed_node2 10896 1726882186.89710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882186.90575: done with get_vars() 10896 1726882186.90589: done getting variables 10896 1726882186.90638: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:29:46 -0400 (0:00:00.029) 0:00:28.473 ****** 10896 1726882186.90662: entering _queue_task() for managed_node2/package 10896 1726882186.90859: worker is 1 (out of 1 available) 10896 1726882186.90872: exiting _queue_task() for managed_node2/package 10896 1726882186.90883: done queuing things up, now waiting for results queue to drain 10896 1726882186.90884: waiting for pending results... 
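For contrast, this is the kind of play variable that would flip the network_state != {} conditional to True and make the nmstate-related installs (including the python3-libnmstate task queued above) actually run; the interface definition is a made-up example, not something from this run:

    network_state:
      interfaces:
        - name: eth0              # hypothetical interface
          type: ethernet
          state: up
          ipv4:
            enabled: true
            dhcp: true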
10896 1726882186.91054: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 10896 1726882186.91145: in run() - task 12673a56-9f93-8b02-b216-000000000087 10896 1726882186.91156: variable 'ansible_search_path' from source: unknown 10896 1726882186.91160: variable 'ansible_search_path' from source: unknown 10896 1726882186.91186: calling self._execute() 10896 1726882186.91255: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882186.91258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882186.91267: variable 'omit' from source: magic vars 10896 1726882186.91524: variable 'ansible_distribution_major_version' from source: facts 10896 1726882186.91533: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882186.91618: variable 'network_state' from source: role '' defaults 10896 1726882186.91626: Evaluated conditional (network_state != {}): False 10896 1726882186.91629: when evaluation is False, skipping this task 10896 1726882186.91632: _execute() done 10896 1726882186.91635: dumping result to json 10896 1726882186.91637: done dumping result, returning 10896 1726882186.91644: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-8b02-b216-000000000087] 10896 1726882186.91650: sending task result for task 12673a56-9f93-8b02-b216-000000000087 10896 1726882186.91740: done sending task result for task 12673a56-9f93-8b02-b216-000000000087 10896 1726882186.91743: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10896 1726882186.91808: no more pending results, returning what we have 10896 1726882186.91811: results queue empty 10896 1726882186.91812: checking for any_errors_fatal 10896 1726882186.91816: done checking for any_errors_fatal 10896 1726882186.91816: checking for max_fail_percentage 10896 1726882186.91818: done checking for max_fail_percentage 10896 1726882186.91819: checking to see if all hosts have failed and the running result is not ok 10896 1726882186.91819: done checking to see if all hosts have failed 10896 1726882186.91820: getting the remaining hosts for this loop 10896 1726882186.91821: done getting the remaining hosts for this loop 10896 1726882186.91824: getting the next task for host managed_node2 10896 1726882186.91830: done getting next task for host managed_node2 10896 1726882186.91832: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 10896 1726882186.91836: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10896 1726882186.91851: getting variables 10896 1726882186.91853: in VariableManager get_vars() 10896 1726882186.91885: Calling all_inventory to load vars for managed_node2 10896 1726882186.91887: Calling groups_inventory to load vars for managed_node2 10896 1726882186.91889: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882186.91899: Calling all_plugins_play to load vars for managed_node2 10896 1726882186.91902: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882186.91904: Calling groups_plugins_play to load vars for managed_node2 10896 1726882186.92718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882186.93570: done with get_vars() 10896 1726882186.93584: done getting variables 10896 1726882186.93630: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:29:46 -0400 (0:00:00.029) 0:00:28.503 ****** 10896 1726882186.93654: entering _queue_task() for managed_node2/service 10896 1726882186.93844: worker is 1 (out of 1 available) 10896 1726882186.93856: exiting _queue_task() for managed_node2/service 10896 1726882186.93867: done queuing things up, now waiting for results queue to drain 10896 1726882186.93869: waiting for pending results... 
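The restart task loads the service action plugin and, like the other wireless/team-gated steps in this run, is skipped immediately below because that conditional is False. A minimal sketch of a restart gated the same way (service name taken from the task title):

    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined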
10896 1726882186.94034: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 10896 1726882186.94130: in run() - task 12673a56-9f93-8b02-b216-000000000088 10896 1726882186.94141: variable 'ansible_search_path' from source: unknown 10896 1726882186.94144: variable 'ansible_search_path' from source: unknown 10896 1726882186.94170: calling self._execute() 10896 1726882186.94240: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882186.94244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882186.94252: variable 'omit' from source: magic vars 10896 1726882186.94503: variable 'ansible_distribution_major_version' from source: facts 10896 1726882186.94512: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882186.94590: variable '__network_wireless_connections_defined' from source: role '' defaults 10896 1726882186.94719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882186.96183: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882186.96238: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882186.96267: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882186.96298: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882186.96317: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882186.96372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.96400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882186.96417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.96444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.96454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.96488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.96510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882186.96526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 10896 1726882186.96550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.96560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.96587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882186.96612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882186.96625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.96649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882186.96659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882186.96773: variable 'network_connections' from source: task vars 10896 1726882186.96783: variable 'port2_profile' from source: play vars 10896 1726882186.96838: variable 'port2_profile' from source: play vars 10896 1726882186.96847: variable 'port1_profile' from source: play vars 10896 1726882186.96889: variable 'port1_profile' from source: play vars 10896 1726882186.96898: variable 'controller_profile' from source: play vars 10896 1726882186.96942: variable 'controller_profile' from source: play vars 10896 1726882186.96989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882186.97111: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882186.97139: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882186.97163: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882186.97185: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882186.97217: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882186.97234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 1726882186.97254: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882186.97272: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882186.97313: variable '__network_team_connections_defined' from source: role '' defaults 10896 1726882186.97460: variable 'network_connections' from source: task vars 10896 1726882186.97463: variable 'port2_profile' from source: play vars 10896 1726882186.97511: variable 'port2_profile' from source: play vars 10896 1726882186.97517: variable 'port1_profile' from source: play vars 10896 1726882186.97558: variable 'port1_profile' from source: play vars 10896 1726882186.97564: variable 'controller_profile' from source: play vars 10896 1726882186.97610: variable 'controller_profile' from source: play vars 10896 1726882186.97629: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10896 1726882186.97642: when evaluation is False, skipping this task 10896 1726882186.97644: _execute() done 10896 1726882186.97648: dumping result to json 10896 1726882186.97650: done dumping result, returning 10896 1726882186.97652: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-8b02-b216-000000000088] 10896 1726882186.97654: sending task result for task 12673a56-9f93-8b02-b216-000000000088 10896 1726882186.97740: done sending task result for task 12673a56-9f93-8b02-b216-000000000088 10896 1726882186.97743: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10896 1726882186.97783: no more pending results, returning what we have 10896 1726882186.97785: results queue empty 10896 1726882186.97786: checking for any_errors_fatal 10896 1726882186.97796: done checking for any_errors_fatal 10896 1726882186.97797: checking for max_fail_percentage 10896 1726882186.97799: done checking for max_fail_percentage 10896 1726882186.97800: checking to see if all hosts have failed and the running result is not ok 10896 1726882186.97801: done checking to see if all hosts have failed 10896 1726882186.97801: getting the remaining hosts for this loop 10896 1726882186.97803: done getting the remaining hosts for this loop 10896 1726882186.97806: getting the next task for host managed_node2 10896 1726882186.97813: done getting next task for host managed_node2 10896 1726882186.97817: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10896 1726882186.97820: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10896 1726882186.97839: getting variables 10896 1726882186.97840: in VariableManager get_vars() 10896 1726882186.97879: Calling all_inventory to load vars for managed_node2 10896 1726882186.97882: Calling groups_inventory to load vars for managed_node2 10896 1726882186.97884: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882186.97901: Calling all_plugins_play to load vars for managed_node2 10896 1726882186.97905: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882186.97908: Calling groups_plugins_play to load vars for managed_node2 10896 1726882186.98704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882186.99686: done with get_vars() 10896 1726882186.99707: done getting variables 10896 1726882186.99750: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:29:46 -0400 (0:00:00.061) 0:00:28.564 ****** 10896 1726882186.99774: entering _queue_task() for managed_node2/service 10896 1726882187.00102: worker is 1 (out of 1 available) 10896 1726882187.00115: exiting _queue_task() for managed_node2/service 10896 1726882187.00126: done queuing things up, now waiting for results queue to drain 10896 1726882187.00128: waiting for pending results... 10896 1726882187.00312: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10896 1726882187.00402: in run() - task 12673a56-9f93-8b02-b216-000000000089 10896 1726882187.00413: variable 'ansible_search_path' from source: unknown 10896 1726882187.00416: variable 'ansible_search_path' from source: unknown 10896 1726882187.00447: calling self._execute() 10896 1726882187.00520: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882187.00523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882187.00533: variable 'omit' from source: magic vars 10896 1726882187.00801: variable 'ansible_distribution_major_version' from source: facts 10896 1726882187.00810: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882187.00917: variable 'network_provider' from source: set_fact 10896 1726882187.00921: variable 'network_state' from source: role '' defaults 10896 1726882187.00929: Evaluated conditional (network_provider == "nm" or network_state != {}): True 10896 1726882187.00934: variable 'omit' from source: magic vars 10896 1726882187.00975: variable 'omit' from source: magic vars 10896 1726882187.01002: variable 'network_service_name' from source: role '' defaults 10896 1726882187.01043: variable 'network_service_name' from source: role '' defaults 10896 1726882187.01119: variable '__network_provider_setup' from source: role '' defaults 10896 1726882187.01124: variable '__network_service_name_default_nm' from source: role '' defaults 10896 1726882187.01167: variable '__network_service_name_default_nm' from source: role '' defaults 10896 1726882187.01175: variable '__network_packages_default_nm' from source: role '' defaults 
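The "Enable and start NetworkManager" task (tasks/main.yml:122) also goes through the 'service' action plugin; the trace above shows it is gated on network_provider == "nm" or network_state != {}, pulls network_service_name from the role defaults, and (as recorded further below) ultimately invokes ansible.legacy.systemd with name=NetworkManager, state=started, enabled=true. A hedged sketch of an equivalent task, assuming the module spelling and quoting (anything not visible in the trace is an assumption):

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"   # resolves to NetworkManager in this run
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}
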
10896 1726882187.01222: variable '__network_packages_default_nm' from source: role '' defaults 10896 1726882187.01392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882187.03439: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882187.03492: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882187.03521: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882187.03545: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882187.03567: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882187.03626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882187.03647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882187.03666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882187.03698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882187.03708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882187.03740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882187.03756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882187.03773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882187.03804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882187.03815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882187.03959: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10896 1726882187.04039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882187.04055: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882187.04072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882187.04099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882187.04114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882187.04172: variable 'ansible_python' from source: facts 10896 1726882187.04188: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 10896 1726882187.04248: variable '__network_wpa_supplicant_required' from source: role '' defaults 10896 1726882187.04301: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10896 1726882187.04381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882187.04400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882187.04417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882187.04446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882187.04457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882187.04489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882187.04511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882187.04528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882187.04556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882187.04566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882187.04660: variable 'network_connections' from 
source: task vars 10896 1726882187.04667: variable 'port2_profile' from source: play vars 10896 1726882187.04720: variable 'port2_profile' from source: play vars 10896 1726882187.04730: variable 'port1_profile' from source: play vars 10896 1726882187.04782: variable 'port1_profile' from source: play vars 10896 1726882187.04898: variable 'controller_profile' from source: play vars 10896 1726882187.04901: variable 'controller_profile' from source: play vars 10896 1726882187.04978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882187.05168: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882187.05227: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882187.05272: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882187.05319: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882187.05380: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882187.05418: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 1726882187.05455: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882187.05490: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882187.05543: variable '__network_wireless_connections_defined' from source: role '' defaults 10896 1726882187.05809: variable 'network_connections' from source: task vars 10896 1726882187.05821: variable 'port2_profile' from source: play vars 10896 1726882187.05895: variable 'port2_profile' from source: play vars 10896 1726882187.05912: variable 'port1_profile' from source: play vars 10896 1726882187.05985: variable 'port1_profile' from source: play vars 10896 1726882187.06006: variable 'controller_profile' from source: play vars 10896 1726882187.06078: variable 'controller_profile' from source: play vars 10896 1726882187.06113: variable '__network_packages_default_wireless' from source: role '' defaults 10896 1726882187.06187: variable '__network_wireless_connections_defined' from source: role '' defaults 10896 1726882187.06432: variable 'network_connections' from source: task vars 10896 1726882187.06435: variable 'port2_profile' from source: play vars 10896 1726882187.06502: variable 'port2_profile' from source: play vars 10896 1726882187.06513: variable 'port1_profile' from source: play vars 10896 1726882187.06573: variable 'port1_profile' from source: play vars 10896 1726882187.06580: variable 'controller_profile' from source: play vars 10896 1726882187.06646: variable 'controller_profile' from source: play vars 10896 1726882187.06668: variable '__network_packages_default_team' from source: role '' defaults 10896 1726882187.06741: variable '__network_team_connections_defined' from source: role '' defaults 10896 1726882187.07004: variable 'network_connections' 
from source: task vars 10896 1726882187.07008: variable 'port2_profile' from source: play vars 10896 1726882187.07074: variable 'port2_profile' from source: play vars 10896 1726882187.07081: variable 'port1_profile' from source: play vars 10896 1726882187.07148: variable 'port1_profile' from source: play vars 10896 1726882187.07155: variable 'controller_profile' from source: play vars 10896 1726882187.07278: variable 'controller_profile' from source: play vars 10896 1726882187.07281: variable '__network_service_name_default_initscripts' from source: role '' defaults 10896 1726882187.07331: variable '__network_service_name_default_initscripts' from source: role '' defaults 10896 1726882187.07337: variable '__network_packages_default_initscripts' from source: role '' defaults 10896 1726882187.07397: variable '__network_packages_default_initscripts' from source: role '' defaults 10896 1726882187.07604: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 10896 1726882187.08298: variable 'network_connections' from source: task vars 10896 1726882187.08301: variable 'port2_profile' from source: play vars 10896 1726882187.08304: variable 'port2_profile' from source: play vars 10896 1726882187.08306: variable 'port1_profile' from source: play vars 10896 1726882187.08308: variable 'port1_profile' from source: play vars 10896 1726882187.08310: variable 'controller_profile' from source: play vars 10896 1726882187.08312: variable 'controller_profile' from source: play vars 10896 1726882187.08313: variable 'ansible_distribution' from source: facts 10896 1726882187.08315: variable '__network_rh_distros' from source: role '' defaults 10896 1726882187.08317: variable 'ansible_distribution_major_version' from source: facts 10896 1726882187.08319: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 10896 1726882187.08453: variable 'ansible_distribution' from source: facts 10896 1726882187.08461: variable '__network_rh_distros' from source: role '' defaults 10896 1726882187.08469: variable 'ansible_distribution_major_version' from source: facts 10896 1726882187.08484: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 10896 1726882187.08638: variable 'ansible_distribution' from source: facts 10896 1726882187.08647: variable '__network_rh_distros' from source: role '' defaults 10896 1726882187.08657: variable 'ansible_distribution_major_version' from source: facts 10896 1726882187.08699: variable 'network_provider' from source: set_fact 10896 1726882187.08728: variable 'omit' from source: magic vars 10896 1726882187.08758: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882187.08785: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882187.08821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882187.08843: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882187.08857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882187.08888: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882187.08899: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882187.08906: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882187.09008: Set connection var ansible_connection to ssh 10896 1726882187.09020: Set connection var ansible_timeout to 10 10896 1726882187.09028: Set connection var ansible_shell_type to sh 10896 1726882187.09035: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882187.09044: Set connection var ansible_shell_executable to /bin/sh 10896 1726882187.09052: Set connection var ansible_pipelining to False 10896 1726882187.09080: variable 'ansible_shell_executable' from source: unknown 10896 1726882187.09088: variable 'ansible_connection' from source: unknown 10896 1726882187.09096: variable 'ansible_module_compression' from source: unknown 10896 1726882187.09104: variable 'ansible_shell_type' from source: unknown 10896 1726882187.09110: variable 'ansible_shell_executable' from source: unknown 10896 1726882187.09116: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882187.09124: variable 'ansible_pipelining' from source: unknown 10896 1726882187.09131: variable 'ansible_timeout' from source: unknown 10896 1726882187.09139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882187.09240: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882187.09258: variable 'omit' from source: magic vars 10896 1726882187.09267: starting attempt loop 10896 1726882187.09274: running the handler 10896 1726882187.09355: variable 'ansible_facts' from source: unknown 10896 1726882187.10077: _low_level_execute_command(): starting 10896 1726882187.10083: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882187.10746: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882187.10758: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882187.10770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882187.10784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882187.10804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882187.10818: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882187.10833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882187.10851: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882187.10914: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882187.10954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882187.10971: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 10896 1726882187.10997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882187.11105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882187.12812: stdout chunk (state=3): >>>/root <<< 10896 1726882187.12933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882187.12945: stdout chunk (state=3): >>><<< 10896 1726882187.12956: stderr chunk (state=3): >>><<< 10896 1726882187.12976: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882187.12987: _low_level_execute_command(): starting 10896 1726882187.12995: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882187.1297617-12315-195437884225334 `" && echo ansible-tmp-1726882187.1297617-12315-195437884225334="` echo /root/.ansible/tmp/ansible-tmp-1726882187.1297617-12315-195437884225334 `" ) && sleep 0' 10896 1726882187.13387: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882187.13422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882187.13425: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882187.13428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882187.13471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882187.13474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 10896 1726882187.13546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882187.15476: stdout chunk (state=3): >>>ansible-tmp-1726882187.1297617-12315-195437884225334=/root/.ansible/tmp/ansible-tmp-1726882187.1297617-12315-195437884225334 <<< 10896 1726882187.15578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882187.15604: stderr chunk (state=3): >>><<< 10896 1726882187.15607: stdout chunk (state=3): >>><<< 10896 1726882187.15622: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882187.1297617-12315-195437884225334=/root/.ansible/tmp/ansible-tmp-1726882187.1297617-12315-195437884225334 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882187.15648: variable 'ansible_module_compression' from source: unknown 10896 1726882187.15687: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 10896 1726882187.15740: variable 'ansible_facts' from source: unknown 10896 1726882187.15872: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882187.1297617-12315-195437884225334/AnsiballZ_systemd.py 10896 1726882187.15970: Sending initial data 10896 1726882187.15974: Sent initial data (156 bytes) 10896 1726882187.16371: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882187.16379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882187.16401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882187.16404: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882187.16414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882187.16459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882187.16475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882187.16537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882187.18138: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882187.18220: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882187.18311: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpzuw2bo5c /root/.ansible/tmp/ansible-tmp-1726882187.1297617-12315-195437884225334/AnsiballZ_systemd.py <<< 10896 1726882187.18315: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882187.1297617-12315-195437884225334/AnsiballZ_systemd.py" <<< 10896 1726882187.18356: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpzuw2bo5c" to remote "/root/.ansible/tmp/ansible-tmp-1726882187.1297617-12315-195437884225334/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882187.1297617-12315-195437884225334/AnsiballZ_systemd.py" <<< 10896 1726882187.20035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882187.20211: stderr chunk (state=3): >>><<< 10896 1726882187.20214: stdout chunk (state=3): >>><<< 10896 1726882187.20216: done transferring module to remote 10896 1726882187.20218: _low_level_execute_command(): starting 10896 1726882187.20220: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882187.1297617-12315-195437884225334/ /root/.ansible/tmp/ansible-tmp-1726882187.1297617-12315-195437884225334/AnsiballZ_systemd.py && sleep 0' 10896 1726882187.20856: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882187.20860: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882187.20932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882187.20964: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882187.20980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882187.21086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882187.22981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882187.22998: stderr chunk (state=3): >>><<< 10896 1726882187.23007: stdout chunk (state=3): >>><<< 10896 1726882187.23027: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882187.23036: _low_level_execute_command(): starting 10896 1726882187.23121: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882187.1297617-12315-195437884225334/AnsiballZ_systemd.py && sleep 0' 10896 1726882187.23665: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882187.23678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882187.23692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882187.23750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 10896 1726882187.23819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882187.23849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882187.23889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882187.23961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882187.53229: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4452352", "MemoryPeak": "7507968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3312906240", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "477023000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": 
"[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredump<<< 10896 1726882187.53235: stdout chunk (state=3): >>>Receive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": 
"0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target multi-user.target", "After": "basic.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": 
"02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 10896 1726882187.55105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882187.55109: stdout chunk (state=3): >>><<< 10896 1726882187.55112: stderr chunk (state=3): >>><<< 10896 1726882187.55129: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4452352", "MemoryPeak": "7507968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3312906240", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "477023000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", 
"Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", 
"CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target multi-user.target", "After": "basic.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", 
"StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 10896 1726882187.55248: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882187.1297617-12315-195437884225334/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882187.55262: _low_level_execute_command(): starting 10896 1726882187.55266: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882187.1297617-12315-195437884225334/ > /dev/null 2>&1 && sleep 0' 10896 1726882187.55676: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882187.55680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882187.55712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882187.55715: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 10896 1726882187.55718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882187.55765: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882187.55768: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882187.55836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882187.57637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882187.57660: stderr chunk (state=3): >>><<< 10896 1726882187.57663: stdout chunk (state=3): >>><<< 10896 1726882187.57674: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882187.57681: handler run complete 10896 1726882187.57722: attempt loop complete, returning result 10896 1726882187.57725: _execute() done 10896 1726882187.57727: dumping result to json 10896 1726882187.57739: done dumping result, returning 10896 1726882187.57748: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-8b02-b216-000000000089] 10896 1726882187.57752: sending task result for task 12673a56-9f93-8b02-b216-000000000089 10896 1726882187.57989: done sending task result for task 12673a56-9f93-8b02-b216-000000000089 10896 1726882187.57991: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10896 1726882187.58050: no more pending results, returning what we have 10896 1726882187.58053: results queue empty 10896 1726882187.58054: checking for any_errors_fatal 10896 1726882187.58059: done checking for any_errors_fatal 10896 1726882187.58059: checking for max_fail_percentage 10896 1726882187.58061: done checking for max_fail_percentage 10896 1726882187.58061: checking to see if all hosts have failed and the running result is not ok 10896 1726882187.58062: done checking to see if all hosts have failed 10896 1726882187.58063: getting the remaining hosts 
for this loop 10896 1726882187.58064: done getting the remaining hosts for this loop 10896 1726882187.58067: getting the next task for host managed_node2 10896 1726882187.58073: done getting next task for host managed_node2 10896 1726882187.58077: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10896 1726882187.58081: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10896 1726882187.58090: getting variables 10896 1726882187.58091: in VariableManager get_vars() 10896 1726882187.58128: Calling all_inventory to load vars for managed_node2 10896 1726882187.58131: Calling groups_inventory to load vars for managed_node2 10896 1726882187.58133: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882187.58141: Calling all_plugins_play to load vars for managed_node2 10896 1726882187.58143: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882187.58146: Calling groups_plugins_play to load vars for managed_node2 10896 1726882187.58892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882187.60341: done with get_vars() 10896 1726882187.60367: done getting variables 10896 1726882187.60431: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:29:47 -0400 (0:00:00.606) 0:00:29.171 ****** 10896 1726882187.60471: entering _queue_task() for managed_node2/service 10896 1726882187.60776: worker is 1 (out of 1 available) 10896 1726882187.60788: exiting _queue_task() for managed_node2/service 10896 1726882187.61003: done queuing things up, now waiting for results queue to drain 10896 1726882187.61005: waiting for pending results... 
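For orientation, the module_args echoed in the NetworkManager result above (name=NetworkManager, state=started, enabled=true, scope=system) map onto an ordinary systemd task. The following is a minimal sketch reconstructed from those arguments for illustration only; it is not the role's actual task file, which the TASK header locates inside roles/network/tasks/main.yml.

- hosts: managed_node2
  tasks:
    # Minimal sketch reconstructed from the module_args in the result above;
    # the real task lives inside the fedora.linux_system_roles.network role
    # and may differ in detail.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
        scope: system

Because the unit was already active and enabled, the module reports "changed": false while still returning the full unit status dictionary seen above.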
10896 1726882187.61080: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10896 1726882187.61245: in run() - task 12673a56-9f93-8b02-b216-00000000008a 10896 1726882187.61266: variable 'ansible_search_path' from source: unknown 10896 1726882187.61274: variable 'ansible_search_path' from source: unknown 10896 1726882187.61321: calling self._execute() 10896 1726882187.61427: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882187.61438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882187.61458: variable 'omit' from source: magic vars 10896 1726882187.61851: variable 'ansible_distribution_major_version' from source: facts 10896 1726882187.61866: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882187.62003: variable 'network_provider' from source: set_fact 10896 1726882187.62101: Evaluated conditional (network_provider == "nm"): True 10896 1726882187.62109: variable '__network_wpa_supplicant_required' from source: role '' defaults 10896 1726882187.62196: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10896 1726882187.62371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882187.64806: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882187.64879: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882187.64925: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882187.64968: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882187.65002: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882187.65085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882187.65153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882187.65157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882187.65206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882187.65229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882187.65284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882187.65370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 10896 1726882187.65373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882187.65396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882187.65418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882187.65462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882187.65498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882187.65529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882187.65572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882187.65600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882187.65900: variable 'network_connections' from source: task vars 10896 1726882187.65903: variable 'port2_profile' from source: play vars 10896 1726882187.65906: variable 'port2_profile' from source: play vars 10896 1726882187.65908: variable 'port1_profile' from source: play vars 10896 1726882187.65910: variable 'port1_profile' from source: play vars 10896 1726882187.65912: variable 'controller_profile' from source: play vars 10896 1726882187.65969: variable 'controller_profile' from source: play vars 10896 1726882187.66066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10896 1726882187.66234: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10896 1726882187.66277: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10896 1726882187.66315: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10896 1726882187.66353: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10896 1726882187.66401: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10896 1726882187.66429: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10896 1726882187.66463: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882187.66499: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10896 1726882187.66550: variable '__network_wireless_connections_defined' from source: role '' defaults 10896 1726882187.66812: variable 'network_connections' from source: task vars 10896 1726882187.66824: variable 'port2_profile' from source: play vars 10896 1726882187.66886: variable 'port2_profile' from source: play vars 10896 1726882187.66907: variable 'port1_profile' from source: play vars 10896 1726882187.66967: variable 'port1_profile' from source: play vars 10896 1726882187.67009: variable 'controller_profile' from source: play vars 10896 1726882187.67059: variable 'controller_profile' from source: play vars 10896 1726882187.67118: Evaluated conditional (__network_wpa_supplicant_required): False 10896 1726882187.67121: when evaluation is False, skipping this task 10896 1726882187.67124: _execute() done 10896 1726882187.67126: dumping result to json 10896 1726882187.67128: done dumping result, returning 10896 1726882187.67133: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-8b02-b216-00000000008a] 10896 1726882187.67227: sending task result for task 12673a56-9f93-8b02-b216-00000000008a 10896 1726882187.67304: done sending task result for task 12673a56-9f93-8b02-b216-00000000008a 10896 1726882187.67307: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 10896 1726882187.67376: no more pending results, returning what we have 10896 1726882187.67380: results queue empty 10896 1726882187.67381: checking for any_errors_fatal 10896 1726882187.67405: done checking for any_errors_fatal 10896 1726882187.67406: checking for max_fail_percentage 10896 1726882187.67409: done checking for max_fail_percentage 10896 1726882187.67410: checking to see if all hosts have failed and the running result is not ok 10896 1726882187.67410: done checking to see if all hosts have failed 10896 1726882187.67411: getting the remaining hosts for this loop 10896 1726882187.67413: done getting the remaining hosts for this loop 10896 1726882187.67416: getting the next task for host managed_node2 10896 1726882187.67425: done getting next task for host managed_node2 10896 1726882187.67429: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 10896 1726882187.67433: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 10896 1726882187.67452: getting variables 10896 1726882187.67453: in VariableManager get_vars() 10896 1726882187.67599: Calling all_inventory to load vars for managed_node2 10896 1726882187.67603: Calling groups_inventory to load vars for managed_node2 10896 1726882187.67607: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882187.67616: Calling all_plugins_play to load vars for managed_node2 10896 1726882187.67620: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882187.67624: Calling groups_plugins_play to load vars for managed_node2 10896 1726882187.69878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882187.72671: done with get_vars() 10896 1726882187.72706: done getting variables 10896 1726882187.72770: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:29:47 -0400 (0:00:00.123) 0:00:29.295 ****** 10896 1726882187.72814: entering _queue_task() for managed_node2/service 10896 1726882187.73172: worker is 1 (out of 1 available) 10896 1726882187.73185: exiting _queue_task() for managed_node2/service 10896 1726882187.73302: done queuing things up, now waiting for results queue to drain 10896 1726882187.73304: waiting for pending results... 
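The skip recorded just above follows directly from the conditionals in the trace: ansible_distribution_major_version != '6' and network_provider == "nm" evaluated True, while __network_wpa_supplicant_required evaluated False, so the service module is never invoked. A hedged sketch of a task gated the same way is shown below; the skipped task's real arguments are not visible in the log, so the module parameters here are assumptions.

- hosts: managed_node2
  vars:
    # Values mirror what the trace evaluated: the provider is nm, but no
    # wireless or 802.1x profiles are defined, so the requirement flag is false.
    network_provider: nm
    __network_wpa_supplicant_required: false
  tasks:
    # Illustrative only: the real task sits at roles/network/tasks/main.yml:133
    # and its exact module arguments are not printed in the skipped result.
    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - network_provider == "nm"
        - __network_wpa_supplicant_required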
10896 1726882187.73619: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 10896 1726882187.73711: in run() - task 12673a56-9f93-8b02-b216-00000000008b 10896 1726882187.73734: variable 'ansible_search_path' from source: unknown 10896 1726882187.73743: variable 'ansible_search_path' from source: unknown 10896 1726882187.73784: calling self._execute() 10896 1726882187.74048: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882187.74053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882187.74152: variable 'omit' from source: magic vars 10896 1726882187.75099: variable 'ansible_distribution_major_version' from source: facts 10896 1726882187.75103: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882187.75244: variable 'network_provider' from source: set_fact 10896 1726882187.75300: Evaluated conditional (network_provider == "initscripts"): False 10896 1726882187.75417: when evaluation is False, skipping this task 10896 1726882187.75419: _execute() done 10896 1726882187.75421: dumping result to json 10896 1726882187.75425: done dumping result, returning 10896 1726882187.75428: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-8b02-b216-00000000008b] 10896 1726882187.75431: sending task result for task 12673a56-9f93-8b02-b216-00000000008b 10896 1726882187.75505: done sending task result for task 12673a56-9f93-8b02-b216-00000000008b 10896 1726882187.75508: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10896 1726882187.75566: no more pending results, returning what we have 10896 1726882187.75571: results queue empty 10896 1726882187.75572: checking for any_errors_fatal 10896 1726882187.75582: done checking for any_errors_fatal 10896 1726882187.75583: checking for max_fail_percentage 10896 1726882187.75585: done checking for max_fail_percentage 10896 1726882187.75586: checking to see if all hosts have failed and the running result is not ok 10896 1726882187.75588: done checking to see if all hosts have failed 10896 1726882187.75588: getting the remaining hosts for this loop 10896 1726882187.75590: done getting the remaining hosts for this loop 10896 1726882187.75597: getting the next task for host managed_node2 10896 1726882187.75606: done getting next task for host managed_node2 10896 1726882187.75610: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10896 1726882187.75615: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 10896 1726882187.75639: getting variables 10896 1726882187.75641: in VariableManager get_vars() 10896 1726882187.75686: Calling all_inventory to load vars for managed_node2 10896 1726882187.75689: Calling groups_inventory to load vars for managed_node2 10896 1726882187.75692: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882187.75954: Calling all_plugins_play to load vars for managed_node2 10896 1726882187.75957: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882187.75960: Calling groups_plugins_play to load vars for managed_node2 10896 1726882187.76870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882187.77736: done with get_vars() 10896 1726882187.77755: done getting variables 10896 1726882187.77800: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:29:47 -0400 (0:00:00.050) 0:00:29.345 ****** 10896 1726882187.77827: entering _queue_task() for managed_node2/copy 10896 1726882187.78084: worker is 1 (out of 1 available) 10896 1726882187.78300: exiting _queue_task() for managed_node2/copy 10896 1726882187.78312: done queuing things up, now waiting for results queue to drain 10896 1726882187.78313: waiting for pending results... 
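The same gating pattern drives the next two skips: with network_provider set to "nm" by set_fact, the condition network_provider == "initscripts" is False for both "Enable network service" and the following "Ensure initscripts network file dependency is present". A sketch of that guard follows; the service name used is an assumption, since the skipped tasks print no module arguments.

- hosts: managed_node2
  vars:
    # The trace shows network_provider was set to "nm" by set_fact, so any
    # initscripts-only task is skipped.
    network_provider: nm
  tasks:
    # Sketch of the gating pattern only; the skipped task's real arguments
    # (service name, copy destination) are not visible in the log above.
    - name: Enable network service
      ansible.builtin.service:
        name: network
        state: started
        enabled: true
      when: network_provider == "initscripts"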
10896 1726882187.78404: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10896 1726882187.78646: in run() - task 12673a56-9f93-8b02-b216-00000000008c 10896 1726882187.78649: variable 'ansible_search_path' from source: unknown 10896 1726882187.78652: variable 'ansible_search_path' from source: unknown 10896 1726882187.78654: calling self._execute() 10896 1726882187.78722: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882187.78734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882187.78753: variable 'omit' from source: magic vars 10896 1726882187.79138: variable 'ansible_distribution_major_version' from source: facts 10896 1726882187.79147: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882187.79232: variable 'network_provider' from source: set_fact 10896 1726882187.79236: Evaluated conditional (network_provider == "initscripts"): False 10896 1726882187.79239: when evaluation is False, skipping this task 10896 1726882187.79242: _execute() done 10896 1726882187.79245: dumping result to json 10896 1726882187.79247: done dumping result, returning 10896 1726882187.79256: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-8b02-b216-00000000008c] 10896 1726882187.79260: sending task result for task 12673a56-9f93-8b02-b216-00000000008c 10896 1726882187.79352: done sending task result for task 12673a56-9f93-8b02-b216-00000000008c 10896 1726882187.79355: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 10896 1726882187.79403: no more pending results, returning what we have 10896 1726882187.79406: results queue empty 10896 1726882187.79407: checking for any_errors_fatal 10896 1726882187.79413: done checking for any_errors_fatal 10896 1726882187.79414: checking for max_fail_percentage 10896 1726882187.79415: done checking for max_fail_percentage 10896 1726882187.79416: checking to see if all hosts have failed and the running result is not ok 10896 1726882187.79417: done checking to see if all hosts have failed 10896 1726882187.79418: getting the remaining hosts for this loop 10896 1726882187.79419: done getting the remaining hosts for this loop 10896 1726882187.79422: getting the next task for host managed_node2 10896 1726882187.79429: done getting next task for host managed_node2 10896 1726882187.79432: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10896 1726882187.79436: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10896 1726882187.79456: getting variables 10896 1726882187.79458: in VariableManager get_vars() 10896 1726882187.79496: Calling all_inventory to load vars for managed_node2 10896 1726882187.79499: Calling groups_inventory to load vars for managed_node2 10896 1726882187.79501: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882187.79510: Calling all_plugins_play to load vars for managed_node2 10896 1726882187.79512: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882187.79515: Calling groups_plugins_play to load vars for managed_node2 10896 1726882187.80349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882187.81191: done with get_vars() 10896 1726882187.81209: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:29:47 -0400 (0:00:00.034) 0:00:29.379 ****** 10896 1726882187.81268: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 10896 1726882187.81476: worker is 1 (out of 1 available) 10896 1726882187.81489: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 10896 1726882187.81503: done queuing things up, now waiting for results queue to drain 10896 1726882187.81505: waiting for pending results... 10896 1726882187.81672: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10896 1726882187.81751: in run() - task 12673a56-9f93-8b02-b216-00000000008d 10896 1726882187.81763: variable 'ansible_search_path' from source: unknown 10896 1726882187.81767: variable 'ansible_search_path' from source: unknown 10896 1726882187.81795: calling self._execute() 10896 1726882187.81870: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882187.81874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882187.81881: variable 'omit' from source: magic vars 10896 1726882187.82148: variable 'ansible_distribution_major_version' from source: facts 10896 1726882187.82157: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882187.82161: variable 'omit' from source: magic vars 10896 1726882187.82211: variable 'omit' from source: magic vars 10896 1726882187.82322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10896 1726882187.83739: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10896 1726882187.83780: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10896 1726882187.83812: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10896 1726882187.83839: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10896 1726882187.83857: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10896 1726882187.83911: variable 'network_provider' from source: set_fact 10896 1726882187.83998: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10896 1726882187.84035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10896 1726882187.84054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10896 1726882187.84080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10896 1726882187.84090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10896 1726882187.84148: variable 'omit' from source: magic vars 10896 1726882187.84225: variable 'omit' from source: magic vars 10896 1726882187.84296: variable 'network_connections' from source: task vars 10896 1726882187.84306: variable 'port2_profile' from source: play vars 10896 1726882187.84348: variable 'port2_profile' from source: play vars 10896 1726882187.84357: variable 'port1_profile' from source: play vars 10896 1726882187.84403: variable 'port1_profile' from source: play vars 10896 1726882187.84410: variable 'controller_profile' from source: play vars 10896 1726882187.84451: variable 'controller_profile' from source: play vars 10896 1726882187.84561: variable 'omit' from source: magic vars 10896 1726882187.84568: variable '__lsr_ansible_managed' from source: task vars 10896 1726882187.84614: variable '__lsr_ansible_managed' from source: task vars 10896 1726882187.84731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 10896 1726882187.84874: Loaded config def from plugin (lookup/template) 10896 1726882187.84878: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 10896 1726882187.84900: File lookup term: get_ansible_managed.j2 10896 1726882187.84903: variable 'ansible_search_path' from source: unknown 10896 1726882187.84908: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 10896 1726882187.84998: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 10896 1726882187.85002: variable 'ansible_search_path' from source: unknown 10896 1726882187.88301: variable 'ansible_managed' from source: unknown 10896 1726882187.88378: variable 'omit' from source: magic vars 10896 1726882187.88400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882187.88418: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882187.88432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882187.88446: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882187.88454: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882187.88474: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882187.88478: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882187.88481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882187.88545: Set connection var ansible_connection to ssh 10896 1726882187.88548: Set connection var ansible_timeout to 10 10896 1726882187.88551: Set connection var ansible_shell_type to sh 10896 1726882187.88558: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882187.88563: Set connection var ansible_shell_executable to /bin/sh 10896 1726882187.88568: Set connection var ansible_pipelining to False 10896 1726882187.88585: variable 'ansible_shell_executable' from source: unknown 10896 1726882187.88588: variable 'ansible_connection' from source: unknown 10896 1726882187.88592: variable 'ansible_module_compression' from source: unknown 10896 1726882187.88597: variable 'ansible_shell_type' from source: unknown 10896 1726882187.88599: variable 'ansible_shell_executable' from source: unknown 10896 1726882187.88601: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882187.88609: variable 'ansible_pipelining' from source: unknown 10896 1726882187.88612: variable 'ansible_timeout' from source: unknown 10896 1726882187.88623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882187.88699: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10896 1726882187.88709: variable 'omit' from source: magic vars 10896 1726882187.88714: starting attempt loop 10896 1726882187.88716: running the handler 10896 1726882187.88728: _low_level_execute_command(): starting 10896 1726882187.88734: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882187.89223: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882187.89228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 10896 1726882187.89231: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882187.89280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882187.89283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882187.89285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882187.89361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882187.91043: stdout chunk (state=3): >>>/root <<< 10896 1726882187.91139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882187.91166: stderr chunk (state=3): >>><<< 10896 1726882187.91169: stdout chunk (state=3): >>><<< 10896 1726882187.91188: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882187.91199: _low_level_execute_command(): starting 10896 1726882187.91205: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882187.9118733-12353-117060925188067 `" && echo ansible-tmp-1726882187.9118733-12353-117060925188067="` echo /root/.ansible/tmp/ansible-tmp-1726882187.9118733-12353-117060925188067 `" ) && sleep 0' 10896 1726882187.91772: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882187.91778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882187.91792: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882187.91844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882187.91847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882187.91916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882187.93786: stdout chunk (state=3): >>>ansible-tmp-1726882187.9118733-12353-117060925188067=/root/.ansible/tmp/ansible-tmp-1726882187.9118733-12353-117060925188067 <<< 10896 1726882187.93939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882187.93942: stdout chunk (state=3): >>><<< 10896 1726882187.93944: stderr chunk (state=3): >>><<< 10896 1726882187.94098: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882187.9118733-12353-117060925188067=/root/.ansible/tmp/ansible-tmp-1726882187.9118733-12353-117060925188067 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882187.94102: variable 'ansible_module_compression' from source: unknown 10896 1726882187.94104: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 10896 1726882187.94106: variable 'ansible_facts' from source: unknown 10896 1726882187.94258: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882187.9118733-12353-117060925188067/AnsiballZ_network_connections.py 10896 1726882187.94461: Sending initial data 10896 1726882187.94464: Sent initial data (168 bytes) 10896 1726882187.95121: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882187.95402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882187.95439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882187.95458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882187.95728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882187.97225: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882187.97305: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10896 1726882187.97369: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmp4up7az95 /root/.ansible/tmp/ansible-tmp-1726882187.9118733-12353-117060925188067/AnsiballZ_network_connections.py <<< 10896 1726882187.97372: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882187.9118733-12353-117060925188067/AnsiballZ_network_connections.py" <<< 10896 1726882187.97425: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmp4up7az95" to remote "/root/.ansible/tmp/ansible-tmp-1726882187.9118733-12353-117060925188067/AnsiballZ_network_connections.py" <<< 10896 1726882187.97428: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882187.9118733-12353-117060925188067/AnsiballZ_network_connections.py" <<< 10896 1726882187.98700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882187.98703: stdout chunk (state=3): >>><<< 10896 1726882187.98705: stderr chunk (state=3): >>><<< 10896 1726882187.98708: done transferring module to remote 10896 1726882187.98710: _low_level_execute_command(): starting 10896 1726882187.98712: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882187.9118733-12353-117060925188067/ /root/.ansible/tmp/ansible-tmp-1726882187.9118733-12353-117060925188067/AnsiballZ_network_connections.py && sleep 0' 10896 1726882187.99428: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882187.99431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882187.99434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882187.99436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882187.99570: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882187.99573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882187.99576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882187.99578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882187.99809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882188.01540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882188.01543: stdout chunk (state=3): >>><<< 10896 1726882188.01549: stderr chunk (state=3): >>><<< 10896 1726882188.01565: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882188.01567: _low_level_execute_command(): starting 10896 1726882188.01592: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882187.9118733-12353-117060925188067/AnsiballZ_network_connections.py && sleep 0' 10896 1726882188.02680: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882188.02684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882188.02686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882188.02689: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882188.02691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 10896 1726882188.02900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882188.02914: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882188.02989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882188.53797: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90t_a955/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90t_a955/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail 
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/68ad5ee6-abdc-455f-b5cc-5daf41c6bbbb: error=unknown <<< 10896 1726882188.55595: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90t_a955/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90t_a955/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/44428abe-10c8-4edc-8c8f-782783521b9e: error=unknown <<< 10896 1726882188.57225: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90t_a955/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90t_a955/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/dd49500a-dfac-4315-8e04-f2fc7751ae5d: error=unknown <<< 10896 1726882188.57471: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 10896 1726882188.59362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
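[editor's note] The result above is the fedora.linux_system_roles.network_connections module run with provider "nm", removing the bond0.1, bond0.0 and bond0 profiles (persistent_state: absent, state: down); the "Connection volatilize aborted ... error=unknown" tracebacks are printed on stdout by the NetworkManager helper, but the module still returns changed: true. The playbook that triggered this is not part of the log, so the following is only a hypothetical reconstruction from the module_args shown above (role variable names network_provider and network_connections assumed from the role's documented interface):

- name: Tear down the bond test profiles (hypothetical reconstruction from module_args)
  hosts: managed_node2
  vars:
    network_provider: nm            # matches "provider": "nm" in the invocation above
    network_connections:            # matches the connections list in the invocation above
      - name: bond0.1
        persistent_state: absent
        state: down
      - name: bond0.0
        persistent_state: absent
        state: down
      - name: bond0
        persistent_state: absent
        state: down
  roles:
    - fedora.linux_system_roles.network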
<<< 10896 1726882188.59656: stdout chunk (state=3): >>><<< 10896 1726882188.59660: stderr chunk (state=3): >>><<< 10896 1726882188.59663: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90t_a955/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90t_a955/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/68ad5ee6-abdc-455f-b5cc-5daf41c6bbbb: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90t_a955/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90t_a955/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/44428abe-10c8-4edc-8c8f-782783521b9e: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90t_a955/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90t_a955/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/dd49500a-dfac-4315-8e04-f2fc7751ae5d: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 10896 1726882188.59666: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882187.9118733-12353-117060925188067/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882188.59674: _low_level_execute_command(): starting 10896 1726882188.59676: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882187.9118733-12353-117060925188067/ > /dev/null 2>&1 && sleep 0' 10896 1726882188.60795: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882188.60799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882188.60812: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882188.60854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 10896 1726882188.60864: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882188.60915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882188.60928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882188.61010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882188.61144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882188.62999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882188.63019: stderr chunk (state=3): >>><<< 10896 1726882188.63030: stdout chunk (state=3): >>><<< 10896 1726882188.63051: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882188.63134: handler run complete 10896 1726882188.63177: attempt loop complete, returning result 10896 1726882188.63502: _execute() done 10896 1726882188.63505: dumping result to json 10896 1726882188.63507: done dumping result, returning 10896 1726882188.63510: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-8b02-b216-00000000008d] 10896 1726882188.63512: sending task result for task 12673a56-9f93-8b02-b216-00000000008d changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 10896 1726882188.63910: no more pending results, returning what we have 10896 1726882188.63913: results queue empty 10896 1726882188.63914: checking for any_errors_fatal 10896 1726882188.63920: done checking for any_errors_fatal 10896 1726882188.63921: checking for max_fail_percentage 10896 1726882188.63923: done checking for max_fail_percentage 10896 1726882188.63924: checking to see if all hosts have failed and the running result is not ok 10896 1726882188.63924: done checking to see if all hosts have failed 10896 1726882188.63925: getting the remaining 
hosts for this loop 10896 1726882188.63926: done getting the remaining hosts for this loop 10896 1726882188.63929: getting the next task for host managed_node2 10896 1726882188.63936: done getting next task for host managed_node2 10896 1726882188.63939: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 10896 1726882188.63943: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10896 1726882188.63952: getting variables 10896 1726882188.63954: in VariableManager get_vars() 10896 1726882188.63990: Calling all_inventory to load vars for managed_node2 10896 1726882188.64012: Calling groups_inventory to load vars for managed_node2 10896 1726882188.64015: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882188.64026: Calling all_plugins_play to load vars for managed_node2 10896 1726882188.64028: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882188.64031: Calling groups_plugins_play to load vars for managed_node2 10896 1726882188.64823: done sending task result for task 12673a56-9f93-8b02-b216-00000000008d 10896 1726882188.64830: WORKER PROCESS EXITING 10896 1726882188.66871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882188.70740: done with get_vars() 10896 1726882188.70763: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:29:48 -0400 (0:00:00.897) 0:00:30.276 ****** 10896 1726882188.70974: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 10896 1726882188.71743: worker is 1 (out of 1 available) 10896 1726882188.71755: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 10896 1726882188.71767: done queuing things up, now waiting for results queue to drain 10896 1726882188.71768: waiting for pending results... 
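[editor's note] The next chunk shows the "Configure networking state" task (roles/network/tasks/main.yml:171) being skipped because the conditional network_state != {} evaluates to False: network_state comes from the role defaults and is empty in this run. As a hedged illustration only (not used in this run, and assuming the nmstate-style schema this role variable is documented to accept, with a hypothetical interface name), a caller would make that conditional true by supplying a non-empty network_state:

- name: Example play with a non-empty network_state (hypothetical)
  hosts: managed_node2
  vars:
    network_state:                  # any non-empty dict makes "network_state != {}" true
      interfaces:                   # nmstate-style schema assumed
        - name: eth1                # hypothetical interface name
          type: ethernet
          state: up
  roles:
    - fedora.linux_system_roles.network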
10896 1726882188.72101: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 10896 1726882188.72430: in run() - task 12673a56-9f93-8b02-b216-00000000008e 10896 1726882188.72452: variable 'ansible_search_path' from source: unknown 10896 1726882188.72899: variable 'ansible_search_path' from source: unknown 10896 1726882188.72903: calling self._execute() 10896 1726882188.72907: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882188.72911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882188.72914: variable 'omit' from source: magic vars 10896 1726882188.73428: variable 'ansible_distribution_major_version' from source: facts 10896 1726882188.73613: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882188.73938: variable 'network_state' from source: role '' defaults 10896 1726882188.73954: Evaluated conditional (network_state != {}): False 10896 1726882188.73962: when evaluation is False, skipping this task 10896 1726882188.73969: _execute() done 10896 1726882188.73978: dumping result to json 10896 1726882188.73986: done dumping result, returning 10896 1726882188.74001: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-8b02-b216-00000000008e] 10896 1726882188.74012: sending task result for task 12673a56-9f93-8b02-b216-00000000008e skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10896 1726882188.74164: no more pending results, returning what we have 10896 1726882188.74167: results queue empty 10896 1726882188.74168: checking for any_errors_fatal 10896 1726882188.74177: done checking for any_errors_fatal 10896 1726882188.74178: checking for max_fail_percentage 10896 1726882188.74179: done checking for max_fail_percentage 10896 1726882188.74180: checking to see if all hosts have failed and the running result is not ok 10896 1726882188.74181: done checking to see if all hosts have failed 10896 1726882188.74182: getting the remaining hosts for this loop 10896 1726882188.74183: done getting the remaining hosts for this loop 10896 1726882188.74186: getting the next task for host managed_node2 10896 1726882188.74198: done getting next task for host managed_node2 10896 1726882188.74202: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10896 1726882188.74211: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10896 1726882188.74224: done sending task result for task 12673a56-9f93-8b02-b216-00000000008e 10896 1726882188.74226: WORKER PROCESS EXITING 10896 1726882188.74242: getting variables 10896 1726882188.74244: in VariableManager get_vars() 10896 1726882188.74284: Calling all_inventory to load vars for managed_node2 10896 1726882188.74287: Calling groups_inventory to load vars for managed_node2 10896 1726882188.74289: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882188.74356: Calling all_plugins_play to load vars for managed_node2 10896 1726882188.74360: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882188.74364: Calling groups_plugins_play to load vars for managed_node2 10896 1726882188.77238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882188.81328: done with get_vars() 10896 1726882188.81358: done getting variables 10896 1726882188.81832: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:29:48 -0400 (0:00:00.108) 0:00:30.385 ****** 10896 1726882188.81872: entering _queue_task() for managed_node2/debug 10896 1726882188.82828: worker is 1 (out of 1 available) 10896 1726882188.82838: exiting _queue_task() for managed_node2/debug 10896 1726882188.82848: done queuing things up, now waiting for results queue to drain 10896 1726882188.82849: waiting for pending results... 
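[editor's note] The following chunk runs the "Show stderr messages for the network_connections" task (roles/network/tasks/main.yml:177) with the debug action plugin and prints __network_connections_result.stderr_lines, which is just [""] here because the module's stderr was a single empty line. A minimal sketch of an equivalent task (the role's actual task file is not reproduced in this log) would be:

- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines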
10896 1726882188.82997: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10896 1726882188.83449: in run() - task 12673a56-9f93-8b02-b216-00000000008f 10896 1726882188.83474: variable 'ansible_search_path' from source: unknown 10896 1726882188.83483: variable 'ansible_search_path' from source: unknown 10896 1726882188.83537: calling self._execute() 10896 1726882188.83901: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882188.83905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882188.83908: variable 'omit' from source: magic vars 10896 1726882188.84682: variable 'ansible_distribution_major_version' from source: facts 10896 1726882188.84702: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882188.84713: variable 'omit' from source: magic vars 10896 1726882188.84772: variable 'omit' from source: magic vars 10896 1726882188.84828: variable 'omit' from source: magic vars 10896 1726882188.84947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882188.85049: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882188.85077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882188.85139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882188.85214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882188.85255: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882188.85264: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882188.85342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882188.85514: Set connection var ansible_connection to ssh 10896 1726882188.85527: Set connection var ansible_timeout to 10 10896 1726882188.85557: Set connection var ansible_shell_type to sh 10896 1726882188.85571: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882188.85901: Set connection var ansible_shell_executable to /bin/sh 10896 1726882188.85904: Set connection var ansible_pipelining to False 10896 1726882188.85907: variable 'ansible_shell_executable' from source: unknown 10896 1726882188.85909: variable 'ansible_connection' from source: unknown 10896 1726882188.85912: variable 'ansible_module_compression' from source: unknown 10896 1726882188.85914: variable 'ansible_shell_type' from source: unknown 10896 1726882188.85916: variable 'ansible_shell_executable' from source: unknown 10896 1726882188.85918: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882188.85920: variable 'ansible_pipelining' from source: unknown 10896 1726882188.85923: variable 'ansible_timeout' from source: unknown 10896 1726882188.85925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882188.86045: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 
1726882188.86065: variable 'omit' from source: magic vars 10896 1726882188.86125: starting attempt loop 10896 1726882188.86132: running the handler 10896 1726882188.86404: variable '__network_connections_result' from source: set_fact 10896 1726882188.86573: handler run complete 10896 1726882188.86598: attempt loop complete, returning result 10896 1726882188.86607: _execute() done 10896 1726882188.86614: dumping result to json 10896 1726882188.86621: done dumping result, returning 10896 1726882188.86633: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-8b02-b216-00000000008f] 10896 1726882188.86641: sending task result for task 12673a56-9f93-8b02-b216-00000000008f 10896 1726882188.86753: done sending task result for task 12673a56-9f93-8b02-b216-00000000008f 10896 1726882188.86761: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 10896 1726882188.86840: no more pending results, returning what we have 10896 1726882188.86845: results queue empty 10896 1726882188.86846: checking for any_errors_fatal 10896 1726882188.86854: done checking for any_errors_fatal 10896 1726882188.86855: checking for max_fail_percentage 10896 1726882188.86856: done checking for max_fail_percentage 10896 1726882188.86857: checking to see if all hosts have failed and the running result is not ok 10896 1726882188.86858: done checking to see if all hosts have failed 10896 1726882188.86859: getting the remaining hosts for this loop 10896 1726882188.86860: done getting the remaining hosts for this loop 10896 1726882188.86863: getting the next task for host managed_node2 10896 1726882188.86872: done getting next task for host managed_node2 10896 1726882188.86876: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 10896 1726882188.86880: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10896 1726882188.86892: getting variables 10896 1726882188.86897: in VariableManager get_vars() 10896 1726882188.86940: Calling all_inventory to load vars for managed_node2 10896 1726882188.86943: Calling groups_inventory to load vars for managed_node2 10896 1726882188.86946: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882188.86956: Calling all_plugins_play to load vars for managed_node2 10896 1726882188.86959: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882188.86962: Calling groups_plugins_play to load vars for managed_node2 10896 1726882188.90251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882188.93438: done with get_vars() 10896 1726882188.93470: done getting variables 10896 1726882188.93533: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:29:48 -0400 (0:00:00.116) 0:00:30.502 ****** 10896 1726882188.93570: entering _queue_task() for managed_node2/debug 10896 1726882188.94517: worker is 1 (out of 1 available) 10896 1726882188.94528: exiting _queue_task() for managed_node2/debug 10896 1726882188.94539: done queuing things up, now waiting for results queue to drain 10896 1726882188.94541: waiting for pending results... 10896 1726882188.94900: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 10896 1726882188.95203: in run() - task 12673a56-9f93-8b02-b216-000000000090 10896 1726882188.95280: variable 'ansible_search_path' from source: unknown 10896 1726882188.95285: variable 'ansible_search_path' from source: unknown 10896 1726882188.95321: calling self._execute() 10896 1726882188.95511: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882188.95517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882188.95528: variable 'omit' from source: magic vars 10896 1726882188.96370: variable 'ansible_distribution_major_version' from source: facts 10896 1726882188.96440: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882188.96443: variable 'omit' from source: magic vars 10896 1726882188.96601: variable 'omit' from source: magic vars 10896 1726882188.96638: variable 'omit' from source: magic vars 10896 1726882188.96796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882188.96856: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882188.96859: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882188.96867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882188.96879: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882188.97073: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882188.97076: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882188.97079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882188.97900: Set connection var ansible_connection to ssh 10896 1726882188.97904: Set connection var ansible_timeout to 10 10896 1726882188.97907: Set connection var ansible_shell_type to sh 10896 1726882188.97909: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882188.97910: Set connection var ansible_shell_executable to /bin/sh 10896 1726882188.97912: Set connection var ansible_pipelining to False 10896 1726882188.97914: variable 'ansible_shell_executable' from source: unknown 10896 1726882188.97916: variable 'ansible_connection' from source: unknown 10896 1726882188.97918: variable 'ansible_module_compression' from source: unknown 10896 1726882188.97920: variable 'ansible_shell_type' from source: unknown 10896 1726882188.97922: variable 'ansible_shell_executable' from source: unknown 10896 1726882188.97924: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882188.97925: variable 'ansible_pipelining' from source: unknown 10896 1726882188.97927: variable 'ansible_timeout' from source: unknown 10896 1726882188.97929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882188.98225: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882188.98234: variable 'omit' from source: magic vars 10896 1726882188.98239: starting attempt loop 10896 1726882188.98242: running the handler 10896 1726882188.98288: variable '__network_connections_result' from source: set_fact 10896 1726882188.98369: variable '__network_connections_result' from source: set_fact 10896 1726882188.98691: handler run complete 10896 1726882188.98719: attempt loop complete, returning result 10896 1726882188.98722: _execute() done 10896 1726882188.98724: dumping result to json 10896 1726882188.98811: done dumping result, returning 10896 1726882188.98814: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-8b02-b216-000000000090] 10896 1726882188.98817: sending task result for task 12673a56-9f93-8b02-b216-000000000090 10896 1726882188.98887: done sending task result for task 12673a56-9f93-8b02-b216-000000000090 10896 1726882188.98891: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 10896 1726882188.99008: no more pending results, returning what we have 10896 1726882188.99011: results queue empty 10896 1726882188.99012: checking for any_errors_fatal 10896 1726882188.99019: done checking for any_errors_fatal 10896 
1726882188.99020: checking for max_fail_percentage 10896 1726882188.99022: done checking for max_fail_percentage 10896 1726882188.99023: checking to see if all hosts have failed and the running result is not ok 10896 1726882188.99024: done checking to see if all hosts have failed 10896 1726882188.99024: getting the remaining hosts for this loop 10896 1726882188.99026: done getting the remaining hosts for this loop 10896 1726882188.99028: getting the next task for host managed_node2 10896 1726882188.99035: done getting next task for host managed_node2 10896 1726882188.99039: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10896 1726882188.99043: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10896 1726882188.99053: getting variables 10896 1726882188.99055: in VariableManager get_vars() 10896 1726882188.99197: Calling all_inventory to load vars for managed_node2 10896 1726882188.99201: Calling groups_inventory to load vars for managed_node2 10896 1726882188.99204: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882188.99214: Calling all_plugins_play to load vars for managed_node2 10896 1726882188.99225: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882188.99228: Calling groups_plugins_play to load vars for managed_node2 10896 1726882189.01742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882189.04756: done with get_vars() 10896 1726882189.04816: done getting variables 10896 1726882189.04874: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:29:49 -0400 (0:00:00.113) 0:00:30.616 ****** 10896 1726882189.04920: entering _queue_task() for managed_node2/debug 10896 1726882189.05399: worker is 1 (out of 1 available) 10896 1726882189.05413: exiting _queue_task() for managed_node2/debug 10896 1726882189.05429: done queuing things up, now waiting for results queue to drain 10896 1726882189.05431: waiting for pending results... 
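[editor's note] In the chunks that follow, the "Show debug messages for the network_state" task is skipped (network_state is still empty), and the role then re-tests connectivity at roles/network/tasks/main.yml:192 using the ping action module; that is what the AnsiballZ_ping.py transfer and execution below implement. A sketch of an equivalent standalone task (the role's own task wording may differ):

- name: Re-test connectivity (sketch of an equivalent task)
  ansible.builtin.ping:

Note that Ansible's ping module verifies the host is reachable over SSH and can run Python after the connection profiles were removed; it does not send ICMP.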
10896 1726882189.05762: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10896 1726882189.05866: in run() - task 12673a56-9f93-8b02-b216-000000000091 10896 1726882189.05887: variable 'ansible_search_path' from source: unknown 10896 1726882189.05900: variable 'ansible_search_path' from source: unknown 10896 1726882189.05941: calling self._execute() 10896 1726882189.06053: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882189.06064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882189.06090: variable 'omit' from source: magic vars 10896 1726882189.06584: variable 'ansible_distribution_major_version' from source: facts 10896 1726882189.06588: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882189.06667: variable 'network_state' from source: role '' defaults 10896 1726882189.06682: Evaluated conditional (network_state != {}): False 10896 1726882189.06697: when evaluation is False, skipping this task 10896 1726882189.06705: _execute() done 10896 1726882189.06712: dumping result to json 10896 1726882189.06719: done dumping result, returning 10896 1726882189.06730: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-8b02-b216-000000000091] 10896 1726882189.06739: sending task result for task 12673a56-9f93-8b02-b216-000000000091 skipping: [managed_node2] => { "false_condition": "network_state != {}" } 10896 1726882189.06984: no more pending results, returning what we have 10896 1726882189.06989: results queue empty 10896 1726882189.06990: checking for any_errors_fatal 10896 1726882189.07004: done checking for any_errors_fatal 10896 1726882189.07005: checking for max_fail_percentage 10896 1726882189.07008: done checking for max_fail_percentage 10896 1726882189.07009: checking to see if all hosts have failed and the running result is not ok 10896 1726882189.07009: done checking to see if all hosts have failed 10896 1726882189.07010: getting the remaining hosts for this loop 10896 1726882189.07012: done getting the remaining hosts for this loop 10896 1726882189.07015: getting the next task for host managed_node2 10896 1726882189.07024: done getting next task for host managed_node2 10896 1726882189.07028: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 10896 1726882189.07033: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10896 1726882189.07056: getting variables 10896 1726882189.07058: in VariableManager get_vars() 10896 1726882189.07283: Calling all_inventory to load vars for managed_node2 10896 1726882189.07287: Calling groups_inventory to load vars for managed_node2 10896 1726882189.07289: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882189.07299: done sending task result for task 12673a56-9f93-8b02-b216-000000000091 10896 1726882189.07302: WORKER PROCESS EXITING 10896 1726882189.07310: Calling all_plugins_play to load vars for managed_node2 10896 1726882189.07313: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882189.07316: Calling groups_plugins_play to load vars for managed_node2 10896 1726882189.14402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882189.16684: done with get_vars() 10896 1726882189.16737: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:29:49 -0400 (0:00:00.119) 0:00:30.735 ****** 10896 1726882189.16847: entering _queue_task() for managed_node2/ping 10896 1726882189.17247: worker is 1 (out of 1 available) 10896 1726882189.17260: exiting _queue_task() for managed_node2/ping 10896 1726882189.17273: done queuing things up, now waiting for results queue to drain 10896 1726882189.17274: waiting for pending results... 10896 1726882189.17578: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 10896 1726882189.17778: in run() - task 12673a56-9f93-8b02-b216-000000000092 10896 1726882189.17803: variable 'ansible_search_path' from source: unknown 10896 1726882189.17813: variable 'ansible_search_path' from source: unknown 10896 1726882189.17860: calling self._execute() 10896 1726882189.17989: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882189.18048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882189.18052: variable 'omit' from source: magic vars 10896 1726882189.18454: variable 'ansible_distribution_major_version' from source: facts 10896 1726882189.18471: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882189.18538: variable 'omit' from source: magic vars 10896 1726882189.18623: variable 'omit' from source: magic vars 10896 1726882189.18681: variable 'omit' from source: magic vars 10896 1726882189.18825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882189.18829: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882189.18832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882189.18851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882189.18879: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882189.18934: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882189.18943: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882189.18952: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node2' 10896 1726882189.19089: Set connection var ansible_connection to ssh 10896 1726882189.19107: Set connection var ansible_timeout to 10 10896 1726882189.19115: Set connection var ansible_shell_type to sh 10896 1726882189.19142: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882189.19175: Set connection var ansible_shell_executable to /bin/sh 10896 1726882189.19247: Set connection var ansible_pipelining to False 10896 1726882189.19250: variable 'ansible_shell_executable' from source: unknown 10896 1726882189.19252: variable 'ansible_connection' from source: unknown 10896 1726882189.19254: variable 'ansible_module_compression' from source: unknown 10896 1726882189.19256: variable 'ansible_shell_type' from source: unknown 10896 1726882189.19257: variable 'ansible_shell_executable' from source: unknown 10896 1726882189.19259: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882189.19261: variable 'ansible_pipelining' from source: unknown 10896 1726882189.19262: variable 'ansible_timeout' from source: unknown 10896 1726882189.19264: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882189.19550: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10896 1726882189.19574: variable 'omit' from source: magic vars 10896 1726882189.19684: starting attempt loop 10896 1726882189.19687: running the handler 10896 1726882189.19689: _low_level_execute_command(): starting 10896 1726882189.19691: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882189.20348: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10896 1726882189.20365: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 10896 1726882189.20412: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882189.20476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882189.20496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882189.20521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882189.20616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882189.22270: stdout chunk (state=3): >>>/root <<< 10896 1726882189.22404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882189.22427: stdout chunk (state=3): >>><<< 10896 1726882189.22440: stderr chunk 
(state=3): >>><<< 10896 1726882189.22462: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882189.22477: _low_level_execute_command(): starting 10896 1726882189.22548: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882189.2246702-12404-165217454577476 `" && echo ansible-tmp-1726882189.2246702-12404-165217454577476="` echo /root/.ansible/tmp/ansible-tmp-1726882189.2246702-12404-165217454577476 `" ) && sleep 0' 10896 1726882189.23133: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882189.23157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882189.23171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882189.23190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882189.23229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882189.23259: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882189.23315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882189.23379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882189.23400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882189.23445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882189.23517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882189.25403: stdout chunk (state=3): 
>>>ansible-tmp-1726882189.2246702-12404-165217454577476=/root/.ansible/tmp/ansible-tmp-1726882189.2246702-12404-165217454577476 <<< 10896 1726882189.25553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882189.25570: stderr chunk (state=3): >>><<< 10896 1726882189.25588: stdout chunk (state=3): >>><<< 10896 1726882189.25801: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882189.2246702-12404-165217454577476=/root/.ansible/tmp/ansible-tmp-1726882189.2246702-12404-165217454577476 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882189.25805: variable 'ansible_module_compression' from source: unknown 10896 1726882189.25807: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 10896 1726882189.25809: variable 'ansible_facts' from source: unknown 10896 1726882189.25865: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882189.2246702-12404-165217454577476/AnsiballZ_ping.py 10896 1726882189.26055: Sending initial data 10896 1726882189.26182: Sent initial data (153 bytes) 10896 1726882189.26820: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882189.26841: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882189.26864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882189.26884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882189.26906: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882189.26923: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882189.26947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882189.26977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882189.27037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 
10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882189.27122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882189.27161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882189.27271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882189.28829: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882189.28895: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882189.28961: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmp4tzvubx9 /root/.ansible/tmp/ansible-tmp-1726882189.2246702-12404-165217454577476/AnsiballZ_ping.py <<< 10896 1726882189.28965: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882189.2246702-12404-165217454577476/AnsiballZ_ping.py" <<< 10896 1726882189.29039: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmp4tzvubx9" to remote "/root/.ansible/tmp/ansible-tmp-1726882189.2246702-12404-165217454577476/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882189.2246702-12404-165217454577476/AnsiballZ_ping.py" <<< 10896 1726882189.29965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882189.29969: stdout chunk (state=3): >>><<< 10896 1726882189.29971: stderr chunk (state=3): >>><<< 10896 1726882189.30100: done transferring module to remote 10896 1726882189.30104: _low_level_execute_command(): starting 10896 1726882189.30107: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882189.2246702-12404-165217454577476/ /root/.ansible/tmp/ansible-tmp-1726882189.2246702-12404-165217454577476/AnsiballZ_ping.py && sleep 0' 10896 1726882189.30784: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882189.30811: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882189.30889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882189.30956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882189.30973: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882189.31034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882189.31121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882189.32863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882189.32866: stdout chunk (state=3): >>><<< 10896 1726882189.32868: stderr chunk (state=3): >>><<< 10896 1726882189.32958: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882189.32962: _low_level_execute_command(): starting 10896 1726882189.32964: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882189.2246702-12404-165217454577476/AnsiballZ_ping.py && sleep 0' 10896 1726882189.33507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882189.33523: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882189.33546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882189.33612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882189.33674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882189.33692: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882189.33719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882189.33820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882189.48716: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 10896 1726882189.49921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882189.49925: stdout chunk (state=3): >>><<< 10896 1726882189.49927: stderr chunk (state=3): >>><<< 10896 1726882189.49930: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
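The exchange above is the standard AnsiballZ round trip for the ping module: the zipped module payload is uploaded to the remote temp directory over SFTP, made executable, run with the remote /usr/bin/python3.12, and its JSON result ({"ping": "pong", ...}) is read back from stdout before the temp directory is cleaned up in the records that follow. A minimal task that would produce an equivalent record is sketched below; the role's actual "Re-test connectivity" task may differ in wording and options.

- name: Re-test connectivity
  ansible.builtin.ping: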
10896 1726882189.50169: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882189.2246702-12404-165217454577476/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882189.50172: _low_level_execute_command(): starting 10896 1726882189.50175: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882189.2246702-12404-165217454577476/ > /dev/null 2>&1 && sleep 0' 10896 1726882189.51315: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882189.51328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882189.51404: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882189.51518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882189.51611: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882189.51684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882189.53534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882189.53542: stderr chunk (state=3): >>><<< 10896 1726882189.53547: stdout chunk (state=3): >>><<< 10896 1726882189.53573: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882189.53579: handler run complete 10896 1726882189.53599: attempt loop complete, returning result 10896 1726882189.53602: _execute() done 10896 1726882189.53605: dumping result to json 10896 1726882189.53607: done dumping result, returning 10896 1726882189.53616: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-8b02-b216-000000000092] 10896 1726882189.53621: sending task result for task 12673a56-9f93-8b02-b216-000000000092 10896 1726882189.53926: done sending task result for task 12673a56-9f93-8b02-b216-000000000092 10896 1726882189.53929: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 10896 1726882189.53979: no more pending results, returning what we have 10896 1726882189.53982: results queue empty 10896 1726882189.53983: checking for any_errors_fatal 10896 1726882189.53989: done checking for any_errors_fatal 10896 1726882189.53989: checking for max_fail_percentage 10896 1726882189.53991: done checking for max_fail_percentage 10896 1726882189.53992: checking to see if all hosts have failed and the running result is not ok 10896 1726882189.53996: done checking to see if all hosts have failed 10896 1726882189.53997: getting the remaining hosts for this loop 10896 1726882189.53998: done getting the remaining hosts for this loop 10896 1726882189.54001: getting the next task for host managed_node2 10896 1726882189.54009: done getting next task for host managed_node2 10896 1726882189.54012: ^ task is: TASK: meta (role_complete) 10896 1726882189.54016: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10896 1726882189.54028: getting variables 10896 1726882189.54030: in VariableManager get_vars() 10896 1726882189.54066: Calling all_inventory to load vars for managed_node2 10896 1726882189.54069: Calling groups_inventory to load vars for managed_node2 10896 1726882189.54072: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882189.54080: Calling all_plugins_play to load vars for managed_node2 10896 1726882189.54083: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882189.54086: Calling groups_plugins_play to load vars for managed_node2 10896 1726882189.56126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882189.57732: done with get_vars() 10896 1726882189.57754: done getting variables 10896 1726882189.57841: done queuing things up, now waiting for results queue to drain 10896 1726882189.57843: results queue empty 10896 1726882189.57843: checking for any_errors_fatal 10896 1726882189.57846: done checking for any_errors_fatal 10896 1726882189.57847: checking for max_fail_percentage 10896 1726882189.57848: done checking for max_fail_percentage 10896 1726882189.57849: checking to see if all hosts have failed and the running result is not ok 10896 1726882189.57849: done checking to see if all hosts have failed 10896 1726882189.57850: getting the remaining hosts for this loop 10896 1726882189.57851: done getting the remaining hosts for this loop 10896 1726882189.57853: getting the next task for host managed_node2 10896 1726882189.57858: done getting next task for host managed_node2 10896 1726882189.57860: ^ task is: TASK: Delete the device '{{ controller_device }}' 10896 1726882189.57862: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10896 1726882189.57864: getting variables 10896 1726882189.57865: in VariableManager get_vars() 10896 1726882189.57878: Calling all_inventory to load vars for managed_node2 10896 1726882189.57880: Calling groups_inventory to load vars for managed_node2 10896 1726882189.57882: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882189.57886: Calling all_plugins_play to load vars for managed_node2 10896 1726882189.57889: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882189.57891: Calling groups_plugins_play to load vars for managed_node2 10896 1726882189.59680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882189.61400: done with get_vars() 10896 1726882189.61424: done getting variables 10896 1726882189.61470: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 10896 1726882189.61598: variable 'controller_device' from source: play vars TASK [Delete the device 'deprecated-bond'] ************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:125 Friday 20 September 2024 21:29:49 -0400 (0:00:00.447) 0:00:31.183 ****** 10896 1726882189.61631: entering _queue_task() for managed_node2/command 10896 1726882189.62036: worker is 1 (out of 1 available) 10896 1726882189.62053: exiting _queue_task() for managed_node2/command 10896 1726882189.62065: done queuing things up, now waiting for results queue to drain 10896 1726882189.62067: waiting for pending results... 
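The task header above points at tests_bond_deprecated.yml:125, and the records that follow show the command action being queued for managed_node2 with controller_device taken from play vars. Reconstructed from the module invocation and the failed_when_result/changed evaluations recorded further down, the task presumably looks roughly like the sketch below; the exact YAML in the test playbook is not shown in this log, so the conditions are an assumption.

- name: Delete the device '{{ controller_device }}'
  ansible.builtin.command: ip link del {{ controller_device }}
  failed_when: false
  changed_when: false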
10896 1726882189.62407: running TaskExecutor() for managed_node2/TASK: Delete the device 'deprecated-bond' 10896 1726882189.62674: in run() - task 12673a56-9f93-8b02-b216-0000000000c2 10896 1726882189.62698: variable 'ansible_search_path' from source: unknown 10896 1726882189.62738: calling self._execute() 10896 1726882189.62857: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882189.62870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882189.62890: variable 'omit' from source: magic vars 10896 1726882189.63292: variable 'ansible_distribution_major_version' from source: facts 10896 1726882189.63314: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882189.63327: variable 'omit' from source: magic vars 10896 1726882189.63353: variable 'omit' from source: magic vars 10896 1726882189.63460: variable 'controller_device' from source: play vars 10896 1726882189.63544: variable 'omit' from source: magic vars 10896 1726882189.63548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882189.63595: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882189.63621: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882189.63644: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882189.63674: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882189.63763: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882189.63767: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882189.63773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882189.63889: Set connection var ansible_connection to ssh 10896 1726882189.63904: Set connection var ansible_timeout to 10 10896 1726882189.63912: Set connection var ansible_shell_type to sh 10896 1726882189.63980: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882189.63984: Set connection var ansible_shell_executable to /bin/sh 10896 1726882189.63987: Set connection var ansible_pipelining to False 10896 1726882189.63990: variable 'ansible_shell_executable' from source: unknown 10896 1726882189.63992: variable 'ansible_connection' from source: unknown 10896 1726882189.63999: variable 'ansible_module_compression' from source: unknown 10896 1726882189.64008: variable 'ansible_shell_type' from source: unknown 10896 1726882189.64012: variable 'ansible_shell_executable' from source: unknown 10896 1726882189.64020: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882189.64023: variable 'ansible_pipelining' from source: unknown 10896 1726882189.64032: variable 'ansible_timeout' from source: unknown 10896 1726882189.64042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882189.64303: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882189.64317: variable 'omit' from source: magic vars 
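The "Set connection var ..." records above give the effective connection settings used for managed_node2 on this task. Expressed as inventory host variables they would correspond to roughly the following; the values are taken from the log, but whether they come from defaults, the inventory, or play-level vars is not visible here.

managed_node2:
  ansible_connection: ssh
  ansible_shell_type: sh
  ansible_shell_executable: /bin/sh
  ansible_timeout: 10
  ansible_pipelining: false
  ansible_module_compression: ZIP_DEFLATED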
10896 1726882189.64507: starting attempt loop 10896 1726882189.64510: running the handler 10896 1726882189.64513: _low_level_execute_command(): starting 10896 1726882189.64515: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882189.65445: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882189.65502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882189.65571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882189.65606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882189.65626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882189.65716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882189.67321: stdout chunk (state=3): >>>/root <<< 10896 1726882189.67474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882189.67485: stdout chunk (state=3): >>><<< 10896 1726882189.67542: stderr chunk (state=3): >>><<< 10896 1726882189.67733: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882189.67737: _low_level_execute_command(): starting 10896 1726882189.67752: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882189.6764615-12423-162479796596904 `" && echo 
ansible-tmp-1726882189.6764615-12423-162479796596904="` echo /root/.ansible/tmp/ansible-tmp-1726882189.6764615-12423-162479796596904 `" ) && sleep 0' 10896 1726882189.68317: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882189.68410: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882189.68423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882189.68437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882189.68449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882189.68535: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882189.70384: stdout chunk (state=3): >>>ansible-tmp-1726882189.6764615-12423-162479796596904=/root/.ansible/tmp/ansible-tmp-1726882189.6764615-12423-162479796596904 <<< 10896 1726882189.70509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882189.70518: stderr chunk (state=3): >>><<< 10896 1726882189.70521: stdout chunk (state=3): >>><<< 10896 1726882189.70538: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882189.6764615-12423-162479796596904=/root/.ansible/tmp/ansible-tmp-1726882189.6764615-12423-162479796596904 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882189.70564: variable 'ansible_module_compression' from source: unknown 10896 1726882189.70607: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10896 1726882189.70638: variable 'ansible_facts' from source: unknown 10896 1726882189.70686: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882189.6764615-12423-162479796596904/AnsiballZ_command.py 10896 1726882189.70781: Sending initial data 10896 1726882189.70784: Sent initial data (156 bytes) 10896 1726882189.71179: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882189.71217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882189.71221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882189.71226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882189.71228: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882189.71231: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882189.71281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882189.71305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882189.71381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882189.72914: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882189.73028: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10896 1726882189.73051: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpo99x4q_d /root/.ansible/tmp/ansible-tmp-1726882189.6764615-12423-162479796596904/AnsiballZ_command.py <<< 10896 1726882189.73075: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882189.6764615-12423-162479796596904/AnsiballZ_command.py" <<< 10896 1726882189.73150: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpo99x4q_d" to remote "/root/.ansible/tmp/ansible-tmp-1726882189.6764615-12423-162479796596904/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882189.6764615-12423-162479796596904/AnsiballZ_command.py" <<< 10896 1726882189.73739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882189.73773: stderr chunk (state=3): >>><<< 10896 1726882189.73777: stdout chunk (state=3): >>><<< 10896 1726882189.73795: done transferring module to remote 10896 1726882189.73806: _low_level_execute_command(): starting 10896 1726882189.73811: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882189.6764615-12423-162479796596904/ /root/.ansible/tmp/ansible-tmp-1726882189.6764615-12423-162479796596904/AnsiballZ_command.py && sleep 0' 10896 1726882189.74219: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882189.74222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882189.74225: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882189.74227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882189.74273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882189.74276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882189.74343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882189.76068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882189.76092: stderr chunk (state=3): >>><<< 10896 1726882189.76098: stdout chunk (state=3): >>><<< 10896 1726882189.76113: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882189.76116: _low_level_execute_command(): starting 10896 1726882189.76120: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882189.6764615-12423-162479796596904/AnsiballZ_command.py && sleep 0' 10896 1726882189.76541: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882189.76545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882189.76547: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882189.76549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 10896 1726882189.76551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882189.76599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882189.76603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882189.76678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882189.92383: stdout chunk (state=3): >>> <<< 10896 1726882189.92404: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "Cannot find device \"deprecated-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "deprecated-bond"], "start": "2024-09-20 21:29:49.915199", "end": "2024-09-20 21:29:49.922433", "delta": "0:00:00.007234", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10896 1726882189.93800: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.14.69 closed. 
<<< 10896 1726882189.93804: stderr chunk (state=3): >>><<< 10896 1726882189.93806: stdout chunk (state=3): >>><<< 10896 1726882189.93809: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"deprecated-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "deprecated-bond"], "start": "2024-09-20 21:29:49.915199", "end": "2024-09-20 21:29:49.922433", "delta": "0:00:00.007234", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.14.69 closed. 
10896 1726882189.93817: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882189.6764615-12423-162479796596904/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882189.93826: _low_level_execute_command(): starting 10896 1726882189.93831: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882189.6764615-12423-162479796596904/ > /dev/null 2>&1 && sleep 0' 10896 1726882189.94507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882189.94569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882189.94591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882189.94680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882189.96700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882189.96704: stdout chunk (state=3): >>><<< 10896 1726882189.96706: stderr chunk (state=3): >>><<< 10896 1726882189.96709: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882189.96711: handler run complete 10896 1726882189.96713: Evaluated conditional (False): False 10896 1726882189.96715: Evaluated conditional (False): False 10896 1726882189.96717: attempt loop complete, returning result 10896 1726882189.96719: _execute() done 10896 1726882189.96721: dumping result to json 10896 1726882189.96722: done dumping result, returning 10896 1726882189.96724: done running TaskExecutor() for managed_node2/TASK: Delete the device 'deprecated-bond' [12673a56-9f93-8b02-b216-0000000000c2] 10896 1726882189.96726: sending task result for task 12673a56-9f93-8b02-b216-0000000000c2 ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "link", "del", "deprecated-bond" ], "delta": "0:00:00.007234", "end": "2024-09-20 21:29:49.922433", "failed_when_result": false, "rc": 1, "start": "2024-09-20 21:29:49.915199" } STDERR: Cannot find device "deprecated-bond" MSG: non-zero return code 10896 1726882189.96857: no more pending results, returning what we have 10896 1726882189.96860: results queue empty 10896 1726882189.96861: checking for any_errors_fatal 10896 1726882189.96863: done checking for any_errors_fatal 10896 1726882189.96863: checking for max_fail_percentage 10896 1726882189.96865: done checking for max_fail_percentage 10896 1726882189.96866: checking to see if all hosts have failed and the running result is not ok 10896 1726882189.96867: done checking to see if all hosts have failed 10896 1726882189.96867: getting the remaining hosts for this loop 10896 1726882189.96869: done getting the remaining hosts for this loop 10896 1726882189.96871: getting the next task for host managed_node2 10896 1726882189.96879: done getting next task for host managed_node2 10896 1726882189.96881: ^ task is: TASK: Remove test interfaces 10896 1726882189.96885: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10896 1726882189.96889: getting variables 10896 1726882189.96891: in VariableManager get_vars() 10896 1726882189.96933: Calling all_inventory to load vars for managed_node2 10896 1726882189.96936: Calling groups_inventory to load vars for managed_node2 10896 1726882189.96938: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882189.96949: Calling all_plugins_play to load vars for managed_node2 10896 1726882189.96952: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882189.96954: Calling groups_plugins_play to load vars for managed_node2 10896 1726882189.97628: done sending task result for task 12673a56-9f93-8b02-b216-0000000000c2 10896 1726882189.97631: WORKER PROCESS EXITING 10896 1726882189.98582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882190.00374: done with get_vars() 10896 1726882190.00399: done getting variables 10896 1726882190.00464: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:29:50 -0400 (0:00:00.388) 0:00:31.571 ****** 10896 1726882190.00501: entering _queue_task() for managed_node2/shell 10896 1726882190.01097: worker is 1 (out of 1 available) 10896 1726882190.01108: exiting _queue_task() for managed_node2/shell 10896 1726882190.01118: done queuing things up, now waiting for results queue to drain 10896 1726882190.01120: waiting for pending results... 
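The "Remove test interfaces" task queued above comes from tasks/remove_test_interfaces_with_dhcp.yml:3 and, per the variable records that follow, is a shell action templating dhcp_interface1 and dhcp_interface2 from play vars. The actual commands are not captured in this excerpt; the sketch below only illustrates the general shape of such a cleanup task, and the specific ip commands are an assumption.

- name: Remove test interfaces
  ansible.builtin.shell: |
    ip link delete {{ dhcp_interface1 }} || true
    ip link delete {{ dhcp_interface2 }} || true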
10896 1726882190.01215: running TaskExecutor() for managed_node2/TASK: Remove test interfaces 10896 1726882190.01469: in run() - task 12673a56-9f93-8b02-b216-0000000000c6 10896 1726882190.01473: variable 'ansible_search_path' from source: unknown 10896 1726882190.01476: variable 'ansible_search_path' from source: unknown 10896 1726882190.01479: calling self._execute() 10896 1726882190.01617: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882190.01644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882190.01686: variable 'omit' from source: magic vars 10896 1726882190.02104: variable 'ansible_distribution_major_version' from source: facts 10896 1726882190.02129: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882190.02198: variable 'omit' from source: magic vars 10896 1726882190.02202: variable 'omit' from source: magic vars 10896 1726882190.02469: variable 'dhcp_interface1' from source: play vars 10896 1726882190.02480: variable 'dhcp_interface2' from source: play vars 10896 1726882190.02509: variable 'omit' from source: magic vars 10896 1726882190.02564: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882190.02609: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882190.02635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882190.02662: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882190.02700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882190.02716: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882190.02723: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882190.02729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882190.02831: Set connection var ansible_connection to ssh 10896 1726882190.02842: Set connection var ansible_timeout to 10 10896 1726882190.02848: Set connection var ansible_shell_type to sh 10896 1726882190.02859: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882190.02880: Set connection var ansible_shell_executable to /bin/sh 10896 1726882190.02882: Set connection var ansible_pipelining to False 10896 1726882190.02909: variable 'ansible_shell_executable' from source: unknown 10896 1726882190.02995: variable 'ansible_connection' from source: unknown 10896 1726882190.02998: variable 'ansible_module_compression' from source: unknown 10896 1726882190.03000: variable 'ansible_shell_type' from source: unknown 10896 1726882190.03003: variable 'ansible_shell_executable' from source: unknown 10896 1726882190.03004: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882190.03006: variable 'ansible_pipelining' from source: unknown 10896 1726882190.03008: variable 'ansible_timeout' from source: unknown 10896 1726882190.03011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882190.03107: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882190.03124: variable 'omit' from source: magic vars 10896 1726882190.03133: starting attempt loop 10896 1726882190.03139: running the handler 10896 1726882190.03151: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882190.03173: _low_level_execute_command(): starting 10896 1726882190.03185: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882190.03984: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882190.04118: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882190.04148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882190.04162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882190.04254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882190.06024: stdout chunk (state=3): >>>/root <<< 10896 1726882190.06113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882190.06258: stderr chunk (state=3): >>><<< 10896 1726882190.06262: stdout chunk (state=3): >>><<< 10896 1726882190.06502: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882190.06507: _low_level_execute_command(): starting 10896 1726882190.06510: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882190.0641863-12447-211911402314084 `" && echo ansible-tmp-1726882190.0641863-12447-211911402314084="` echo /root/.ansible/tmp/ansible-tmp-1726882190.0641863-12447-211911402314084 `" ) && sleep 0' 10896 1726882190.07609: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882190.07638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882190.07656: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882190.07777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882190.10002: stdout chunk (state=3): >>>ansible-tmp-1726882190.0641863-12447-211911402314084=/root/.ansible/tmp/ansible-tmp-1726882190.0641863-12447-211911402314084 <<< 10896 1726882190.10006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882190.10009: stdout chunk (state=3): >>><<< 10896 1726882190.10012: stderr chunk (state=3): >>><<< 10896 1726882190.10015: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882190.0641863-12447-211911402314084=/root/.ansible/tmp/ansible-tmp-1726882190.0641863-12447-211911402314084 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882190.10017: variable 'ansible_module_compression' from source: unknown 10896 1726882190.10019: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10896 1726882190.10022: variable 'ansible_facts' from source: unknown 10896 1726882190.10024: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882190.0641863-12447-211911402314084/AnsiballZ_command.py 10896 1726882190.10239: Sending initial data 10896 1726882190.10244: Sent initial data (156 bytes) 10896 1726882190.10750: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882190.10753: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882190.10755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882190.10903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882190.10908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882190.10910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882190.10913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882190.11017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882190.12534: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882190.12595: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10896 1726882190.12648: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmp_zmc2_d7 /root/.ansible/tmp/ansible-tmp-1726882190.0641863-12447-211911402314084/AnsiballZ_command.py <<< 10896 1726882190.12651: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882190.0641863-12447-211911402314084/AnsiballZ_command.py" <<< 10896 1726882190.12826: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmp_zmc2_d7" to remote "/root/.ansible/tmp/ansible-tmp-1726882190.0641863-12447-211911402314084/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882190.0641863-12447-211911402314084/AnsiballZ_command.py" <<< 10896 1726882190.14147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882190.14152: stdout chunk (state=3): >>><<< 10896 1726882190.14157: stderr chunk (state=3): >>><<< 10896 1726882190.14212: done transferring module to remote 10896 1726882190.14223: _low_level_execute_command(): starting 10896 1726882190.14229: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882190.0641863-12447-211911402314084/ /root/.ansible/tmp/ansible-tmp-1726882190.0641863-12447-211911402314084/AnsiballZ_command.py && sleep 0' 10896 1726882190.15310: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882190.15316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882190.15332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882190.15338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882190.15354: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 10896 1726882190.15357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882190.15373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 10896 1726882190.15378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882190.15449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882190.15463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882190.15480: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882190.15583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882190.17306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882190.17435: stderr chunk (state=3): >>><<< 10896 1726882190.17438: stdout chunk (state=3): >>><<< 10896 1726882190.17533: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882190.17536: _low_level_execute_command(): starting 10896 1726882190.17539: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882190.0641863-12447-211911402314084/AnsiballZ_command.py && sleep 0' 10896 1726882190.18752: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882190.18755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882190.18758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882190.18834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882190.18838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882190.18840: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882190.18843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882190.18927: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882190.19130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882190.19198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882190.38287: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could 
not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:29:50.339234", "end": "2024-09-20 21:29:50.381465", "delta": "0:00:00.042231", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10896 1726882190.39717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882190.39853: stderr chunk (state=3): >>><<< 10896 1726882190.39856: stdout chunk (state=3): >>><<< 10896 1726882190.39859: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:29:50.339234", "end": "2024-09-20 21:29:50.381465", "delta": "0:00:00.042231", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 10896 1726882190.39909: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882190.0641863-12447-211911402314084/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882190.40126: _low_level_execute_command(): starting 10896 1726882190.40129: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882190.0641863-12447-211911402314084/ > /dev/null 2>&1 && sleep 0' 10896 1726882190.41500: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882190.41515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882190.41531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882190.41549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882190.41728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882190.43536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882190.43581: stderr chunk (state=3): >>><<< 10896 1726882190.43812: stdout chunk (state=3): >>><<< 10896 1726882190.43830: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882190.43837: handler run complete 10896 1726882190.43859: Evaluated conditional (False): False 10896 1726882190.43870: attempt loop complete, returning result 10896 1726882190.43873: _execute() done 10896 1726882190.43876: dumping result to json 10896 1726882190.43882: done dumping result, returning 10896 1726882190.43890: done running TaskExecutor() for managed_node2/TASK: Remove test interfaces [12673a56-9f93-8b02-b216-0000000000c6] 10896 1726882190.43962: sending task result for task 12673a56-9f93-8b02-b216-0000000000c6 10896 1726882190.44034: done sending task result for task 12673a56-9f93-8b02-b216-0000000000c6 10896 1726882190.44037: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.042231", "end": "2024-09-20 21:29:50.381465", "rc": 0, "start": "2024-09-20 21:29:50.339234" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 10896 1726882190.44132: no more pending results, returning what we have 10896 1726882190.44135: results queue empty 10896 1726882190.44135: checking for any_errors_fatal 10896 1726882190.44145: done checking for any_errors_fatal 10896 1726882190.44146: checking for max_fail_percentage 10896 1726882190.44147: done checking for max_fail_percentage 10896 1726882190.44148: checking to see if all hosts have failed and the running result is not ok 10896 1726882190.44149: done checking to see if all hosts have failed 10896 1726882190.44150: getting the remaining hosts for this loop 10896 1726882190.44151: done getting the remaining hosts for this loop 10896 1726882190.44154: getting the next task for host managed_node2 10896 1726882190.44161: done getting next task for host managed_node2 10896 1726882190.44164: ^ task is: TASK: Stop dnsmasq/radvd services 10896 1726882190.44168: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10896 1726882190.44397: getting variables 10896 1726882190.44400: in VariableManager get_vars() 10896 1726882190.44438: Calling all_inventory to load vars for managed_node2 10896 1726882190.44441: Calling groups_inventory to load vars for managed_node2 10896 1726882190.44443: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882190.44455: Calling all_plugins_play to load vars for managed_node2 10896 1726882190.44458: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882190.44461: Calling groups_plugins_play to load vars for managed_node2 10896 1726882190.47341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882190.50754: done with get_vars() 10896 1726882190.50785: done getting variables 10896 1726882190.51049: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 21:29:50 -0400 (0:00:00.505) 0:00:32.077 ****** 10896 1726882190.51091: entering _queue_task() for managed_node2/shell 10896 1726882190.51879: worker is 1 (out of 1 available) 10896 1726882190.51892: exiting _queue_task() for managed_node2/shell 10896 1726882190.51907: done queuing things up, now waiting for results queue to drain 10896 1726882190.51908: waiting for pending results... 
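For reference, the shell script executed by the "Remove test interfaces" task above is reproduced here in readable form. It is taken from the _raw_params shown in the module invocation recorded in this log; only the layout and the comments are added.

    set -euxo pipefail
    exec 1>&2                          # send all output to stderr, so the module's stdout stays empty
    rc=0
    # delete the test links created earlier; capture the error code instead of aborting
    ip link delete test1 || rc="$?"
    if [ "$rc" != 0 ]; then
        echo ERROR - could not delete link test1 - error "$rc"
    fi
    ip link delete test2 || rc="$?"
    if [ "$rc" != 0 ]; then
        echo ERROR - could not delete link test2 - error "$rc"
    fi
    ip link delete testbr || rc="$?"
    if [ "$rc" != 0 ]; then
        echo ERROR - could not delete link testbr - error "$rc"
    fi

The `+`-prefixed trace in the task's STDERR above shows all three `ip link delete` commands returned 0 on this run, so none of the error branches fired.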
10896 1726882190.52513: running TaskExecutor() for managed_node2/TASK: Stop dnsmasq/radvd services 10896 1726882190.52649: in run() - task 12673a56-9f93-8b02-b216-0000000000c7 10896 1726882190.52734: variable 'ansible_search_path' from source: unknown 10896 1726882190.52742: variable 'ansible_search_path' from source: unknown 10896 1726882190.52809: calling self._execute() 10896 1726882190.53152: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882190.53155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882190.53158: variable 'omit' from source: magic vars 10896 1726882190.53710: variable 'ansible_distribution_major_version' from source: facts 10896 1726882190.53729: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882190.53740: variable 'omit' from source: magic vars 10896 1726882190.53802: variable 'omit' from source: magic vars 10896 1726882190.53842: variable 'omit' from source: magic vars 10896 1726882190.53884: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882190.53932: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882190.53956: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882190.53977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882190.53998: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882190.54038: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882190.54046: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882190.54053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882190.54159: Set connection var ansible_connection to ssh 10896 1726882190.54171: Set connection var ansible_timeout to 10 10896 1726882190.54178: Set connection var ansible_shell_type to sh 10896 1726882190.54189: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882190.54203: Set connection var ansible_shell_executable to /bin/sh 10896 1726882190.54212: Set connection var ansible_pipelining to False 10896 1726882190.54242: variable 'ansible_shell_executable' from source: unknown 10896 1726882190.54251: variable 'ansible_connection' from source: unknown 10896 1726882190.54257: variable 'ansible_module_compression' from source: unknown 10896 1726882190.54262: variable 'ansible_shell_type' from source: unknown 10896 1726882190.54268: variable 'ansible_shell_executable' from source: unknown 10896 1726882190.54273: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882190.54342: variable 'ansible_pipelining' from source: unknown 10896 1726882190.54345: variable 'ansible_timeout' from source: unknown 10896 1726882190.54347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882190.54434: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882190.54457: variable 'omit' from source: magic vars 10896 
1726882190.54467: starting attempt loop 10896 1726882190.54474: running the handler 10896 1726882190.54486: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882190.54513: _low_level_execute_command(): starting 10896 1726882190.54525: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882190.55284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882190.55402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882190.55417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882190.55443: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882190.55460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882190.55559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882190.57134: stdout chunk (state=3): >>>/root <<< 10896 1726882190.57279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882190.57299: stdout chunk (state=3): >>><<< 10896 1726882190.57320: stderr chunk (state=3): >>><<< 10896 1726882190.57339: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 
1726882190.57352: _low_level_execute_command(): starting 10896 1726882190.57358: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882190.573379-12469-162284056220200 `" && echo ansible-tmp-1726882190.573379-12469-162284056220200="` echo /root/.ansible/tmp/ansible-tmp-1726882190.573379-12469-162284056220200 `" ) && sleep 0' 10896 1726882190.57762: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882190.57771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882190.57792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882190.57832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882190.57835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882190.57900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882190.59850: stdout chunk (state=3): >>>ansible-tmp-1726882190.573379-12469-162284056220200=/root/.ansible/tmp/ansible-tmp-1726882190.573379-12469-162284056220200 <<< 10896 1726882190.59935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882190.59939: stdout chunk (state=3): >>><<< 10896 1726882190.59941: stderr chunk (state=3): >>><<< 10896 1726882190.60101: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882190.573379-12469-162284056220200=/root/.ansible/tmp/ansible-tmp-1726882190.573379-12469-162284056220200 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882190.60104: variable 'ansible_module_compression' from source: unknown 10896 1726882190.60107: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10896 1726882190.60109: variable 'ansible_facts' from source: unknown 10896 1726882190.60175: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882190.573379-12469-162284056220200/AnsiballZ_command.py 10896 1726882190.60325: Sending initial data 10896 1726882190.60344: Sent initial data (155 bytes) 10896 1726882190.60764: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882190.60778: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882190.60810: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882190.60862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882190.60873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882190.60926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882190.62619: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10896 1726882190.62696: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpz83go013 /root/.ansible/tmp/ansible-tmp-1726882190.573379-12469-162284056220200/AnsiballZ_command.py <<< 10896 1726882190.62700: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882190.573379-12469-162284056220200/AnsiballZ_command.py" <<< 10896 1726882190.62784: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpz83go013" to remote "/root/.ansible/tmp/ansible-tmp-1726882190.573379-12469-162284056220200/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882190.573379-12469-162284056220200/AnsiballZ_command.py" <<< 10896 1726882190.63888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882190.64123: stderr chunk (state=3): >>><<< 10896 1726882190.64126: stdout chunk (state=3): >>><<< 10896 1726882190.64129: done transferring module to remote 10896 1726882190.64131: _low_level_execute_command(): starting 10896 1726882190.64133: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882190.573379-12469-162284056220200/ /root/.ansible/tmp/ansible-tmp-1726882190.573379-12469-162284056220200/AnsiballZ_command.py && sleep 0' 10896 1726882190.64860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882190.64874: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882190.64885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882190.64979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882190.66938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882190.66942: stdout chunk (state=3): >>><<< 10896 1726882190.66967: stderr chunk (state=3): >>><<< 10896 1726882190.66970: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882190.66973: _low_level_execute_command(): starting 10896 1726882190.66975: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882190.573379-12469-162284056220200/AnsiballZ_command.py && sleep 0' 10896 1726882190.67744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882190.67747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882190.67749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882190.67798: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882190.67802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882190.67845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882190.67867: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882190.67873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882190.67964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882190.85460: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:29:50.825993", "end": "2024-09-20 21:29:50.851740", "delta": "0:00:00.025747", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F 
/run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10896 1726882190.86869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 10896 1726882190.86873: stdout chunk (state=3): >>><<< 10896 1726882190.86875: stderr chunk (state=3): >>><<< 10896 1726882190.87016: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:29:50.825993", "end": "2024-09-20 21:29:50.851740", "delta": "0:00:00.025747", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 10896 1726882190.87024: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882190.573379-12469-162284056220200/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882190.87027: _low_level_execute_command(): starting 10896 1726882190.87030: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882190.573379-12469-162284056220200/ > /dev/null 2>&1 && sleep 0' 10896 1726882190.87547: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882190.87560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882190.87576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882190.87601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882190.87700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882190.87721: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882190.87740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882190.87945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882190.89772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882190.89775: stdout chunk (state=3): >>><<< 10896 1726882190.89789: stderr chunk 
(state=3): >>><<< 10896 1726882190.90103: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882190.90107: handler run complete 10896 1726882190.90110: Evaluated conditional (False): False 10896 1726882190.90112: attempt loop complete, returning result 10896 1726882190.90115: _execute() done 10896 1726882190.90117: dumping result to json 10896 1726882190.90119: done dumping result, returning 10896 1726882190.90121: done running TaskExecutor() for managed_node2/TASK: Stop dnsmasq/radvd services [12673a56-9f93-8b02-b216-0000000000c7] 10896 1726882190.90123: sending task result for task 12673a56-9f93-8b02-b216-0000000000c7 10896 1726882190.90202: done sending task result for task 12673a56-9f93-8b02-b216-0000000000c7 ok: [managed_node2] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.025747", "end": "2024-09-20 21:29:50.851740", "rc": 0, "start": "2024-09-20 21:29:50.825993" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 10896 1726882190.90278: no more pending results, returning what we have 10896 1726882190.90282: results queue empty 10896 1726882190.90283: checking for any_errors_fatal 10896 1726882190.90300: done checking for any_errors_fatal 10896 1726882190.90301: checking for max_fail_percentage 10896 1726882190.90303: done checking for max_fail_percentage 10896 1726882190.90304: checking to see if all hosts have failed and the running result is not ok 10896 1726882190.90306: done checking to see if all hosts have failed 10896 1726882190.90306: getting the remaining hosts for this loop 10896 1726882190.90308: done getting the remaining hosts for this loop 10896 1726882190.90496: getting the next task for host managed_node2 10896 1726882190.90507: done getting next task for host managed_node2 10896 
1726882190.90510: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 10896 1726882190.90514: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10896 1726882190.90519: getting variables 10896 1726882190.90521: in VariableManager get_vars() 10896 1726882190.90566: Calling all_inventory to load vars for managed_node2 10896 1726882190.90570: Calling groups_inventory to load vars for managed_node2 10896 1726882190.90573: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882190.90701: Calling all_plugins_play to load vars for managed_node2 10896 1726882190.90706: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882190.90715: Calling groups_plugins_play to load vars for managed_node2 10896 1726882190.91530: WORKER PROCESS EXITING 10896 1726882190.93061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882190.94680: done with get_vars() 10896 1726882190.94709: done getting variables 10896 1726882190.94766: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:131 Friday 20 September 2024 21:29:50 -0400 (0:00:00.437) 0:00:32.514 ****** 10896 1726882190.94805: entering _queue_task() for managed_node2/command 10896 1726882190.95235: worker is 1 (out of 1 available) 10896 1726882190.95247: exiting _queue_task() for managed_node2/command 10896 1726882190.95260: done queuing things up, now waiting for results queue to drain 10896 1726882190.95262: waiting for pending results... 
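For readability, the cleanup script that the "Stop dnsmasq/radvd services" task ran above, taken from the cmd field of its result and only reflowed and commented here (shell content otherwise unchanged):

    # Kill the process recorded in the testbr DHCP pid file (the test dnsmasq instance)
    # and remove its runtime files
    set -uxo pipefail
    exec 1>&2
    pkill -F /run/dhcp_testbr.pid
    rm -rf /run/dhcp_testbr.pid
    rm -rf /run/dhcp_testbr.lease
    # On EL6, also stop radvd and delete the DHCP iptables rule for testbr
    if grep 'release 6' /etc/redhat-release; then
        # Stop radvd server
        service radvd stop
        iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT
    fi
    # If firewalld is running, remove the DHCP-related services opened for the test
    if systemctl is-active firewalld; then
        for service in dhcp dhcpv6 dhcpv6-client; do
            if firewall-cmd --query-service="$service"; then
                firewall-cmd --remove-service "$service"
            fi
        done
    fi

In this run the trace in STDERR shows 'systemctl is-active firewalld' reporting 'inactive', so the firewall-cmd loop was skipped and the task finished with rc=0 and changed=false.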
10896 1726882190.95495: running TaskExecutor() for managed_node2/TASK: Restore the /etc/resolv.conf for initscript 10896 1726882190.95659: in run() - task 12673a56-9f93-8b02-b216-0000000000c8 10896 1726882190.95690: variable 'ansible_search_path' from source: unknown 10896 1726882190.95742: calling self._execute() 10896 1726882190.95884: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882190.95905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882190.95927: variable 'omit' from source: magic vars 10896 1726882190.96342: variable 'ansible_distribution_major_version' from source: facts 10896 1726882190.96370: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882190.96595: variable 'network_provider' from source: set_fact 10896 1726882190.96610: Evaluated conditional (network_provider == "initscripts"): False 10896 1726882190.96618: when evaluation is False, skipping this task 10896 1726882190.96625: _execute() done 10896 1726882190.96633: dumping result to json 10896 1726882190.96640: done dumping result, returning 10896 1726882190.96648: done running TaskExecutor() for managed_node2/TASK: Restore the /etc/resolv.conf for initscript [12673a56-9f93-8b02-b216-0000000000c8] 10896 1726882190.96688: sending task result for task 12673a56-9f93-8b02-b216-0000000000c8 10896 1726882190.96759: done sending task result for task 12673a56-9f93-8b02-b216-0000000000c8 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 10896 1726882190.96845: no more pending results, returning what we have 10896 1726882190.96848: results queue empty 10896 1726882190.96849: checking for any_errors_fatal 10896 1726882190.96861: done checking for any_errors_fatal 10896 1726882190.96862: checking for max_fail_percentage 10896 1726882190.96863: done checking for max_fail_percentage 10896 1726882190.96864: checking to see if all hosts have failed and the running result is not ok 10896 1726882190.96865: done checking to see if all hosts have failed 10896 1726882190.96866: getting the remaining hosts for this loop 10896 1726882190.96867: done getting the remaining hosts for this loop 10896 1726882190.96870: getting the next task for host managed_node2 10896 1726882190.96878: done getting next task for host managed_node2 10896 1726882190.96880: ^ task is: TASK: Verify network state restored to default 10896 1726882190.96884: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10896 1726882190.96889: getting variables 10896 1726882190.96890: in VariableManager get_vars() 10896 1726882190.96935: Calling all_inventory to load vars for managed_node2 10896 1726882190.96938: Calling groups_inventory to load vars for managed_node2 10896 1726882190.96941: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882190.96953: Calling all_plugins_play to load vars for managed_node2 10896 1726882190.96957: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882190.96960: Calling groups_plugins_play to load vars for managed_node2 10896 1726882190.97607: WORKER PROCESS EXITING 10896 1726882190.98509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882191.00100: done with get_vars() 10896 1726882191.00120: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:136 Friday 20 September 2024 21:29:51 -0400 (0:00:00.054) 0:00:32.568 ****** 10896 1726882191.00205: entering _queue_task() for managed_node2/include_tasks 10896 1726882191.00710: worker is 1 (out of 1 available) 10896 1726882191.00720: exiting _queue_task() for managed_node2/include_tasks 10896 1726882191.00731: done queuing things up, now waiting for results queue to drain 10896 1726882191.00732: waiting for pending results... 10896 1726882191.00867: running TaskExecutor() for managed_node2/TASK: Verify network state restored to default 10896 1726882191.01010: in run() - task 12673a56-9f93-8b02-b216-0000000000c9 10896 1726882191.01033: variable 'ansible_search_path' from source: unknown 10896 1726882191.01083: calling self._execute() 10896 1726882191.01197: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882191.01212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882191.01228: variable 'omit' from source: magic vars 10896 1726882191.01674: variable 'ansible_distribution_major_version' from source: facts 10896 1726882191.01701: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882191.01725: _execute() done 10896 1726882191.01740: dumping result to json 10896 1726882191.01748: done dumping result, returning 10896 1726882191.01758: done running TaskExecutor() for managed_node2/TASK: Verify network state restored to default [12673a56-9f93-8b02-b216-0000000000c9] 10896 1726882191.01779: sending task result for task 12673a56-9f93-8b02-b216-0000000000c9 10896 1726882191.01898: done sending task result for task 12673a56-9f93-8b02-b216-0000000000c9 10896 1726882191.01902: WORKER PROCESS EXITING 10896 1726882191.01957: no more pending results, returning what we have 10896 1726882191.01963: in VariableManager get_vars() 10896 1726882191.02021: Calling all_inventory to load vars for managed_node2 10896 1726882191.02025: Calling groups_inventory to load vars for managed_node2 10896 1726882191.02028: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882191.02042: Calling all_plugins_play to load vars for managed_node2 10896 1726882191.02046: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882191.02050: Calling groups_plugins_play to load vars for managed_node2 10896 1726882191.03535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 
1726882191.04375: done with get_vars() 10896 1726882191.04387: variable 'ansible_search_path' from source: unknown 10896 1726882191.04400: we have included files to process 10896 1726882191.04401: generating all_blocks data 10896 1726882191.04403: done generating all_blocks data 10896 1726882191.04406: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 10896 1726882191.04407: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 10896 1726882191.04408: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 10896 1726882191.04672: done processing included file 10896 1726882191.04674: iterating over new_blocks loaded from include file 10896 1726882191.04676: in VariableManager get_vars() 10896 1726882191.04697: done with get_vars() 10896 1726882191.04699: filtering new block on tags 10896 1726882191.04722: done filtering new block on tags 10896 1726882191.04724: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 10896 1726882191.04728: extending task lists for all hosts with included blocks 10896 1726882191.05922: done extending task lists 10896 1726882191.05923: done processing included files 10896 1726882191.05923: results queue empty 10896 1726882191.05924: checking for any_errors_fatal 10896 1726882191.05926: done checking for any_errors_fatal 10896 1726882191.05926: checking for max_fail_percentage 10896 1726882191.05927: done checking for max_fail_percentage 10896 1726882191.05927: checking to see if all hosts have failed and the running result is not ok 10896 1726882191.05928: done checking to see if all hosts have failed 10896 1726882191.05928: getting the remaining hosts for this loop 10896 1726882191.05929: done getting the remaining hosts for this loop 10896 1726882191.05931: getting the next task for host managed_node2 10896 1726882191.05933: done getting next task for host managed_node2 10896 1726882191.05935: ^ task is: TASK: Check routes and DNS 10896 1726882191.05936: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10896 1726882191.05938: getting variables 10896 1726882191.05938: in VariableManager get_vars() 10896 1726882191.05947: Calling all_inventory to load vars for managed_node2 10896 1726882191.05949: Calling groups_inventory to load vars for managed_node2 10896 1726882191.05950: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882191.05954: Calling all_plugins_play to load vars for managed_node2 10896 1726882191.05955: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882191.05957: Calling groups_plugins_play to load vars for managed_node2 10896 1726882191.06601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882191.07480: done with get_vars() 10896 1726882191.07497: done getting variables 10896 1726882191.07526: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:29:51 -0400 (0:00:00.073) 0:00:32.642 ****** 10896 1726882191.07546: entering _queue_task() for managed_node2/shell 10896 1726882191.07773: worker is 1 (out of 1 available) 10896 1726882191.07786: exiting _queue_task() for managed_node2/shell 10896 1726882191.07808: done queuing things up, now waiting for results queue to drain 10896 1726882191.07810: waiting for pending results... 10896 1726882191.08173: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 10896 1726882191.08250: in run() - task 12673a56-9f93-8b02-b216-000000000570 10896 1726882191.08254: variable 'ansible_search_path' from source: unknown 10896 1726882191.08257: variable 'ansible_search_path' from source: unknown 10896 1726882191.08260: calling self._execute() 10896 1726882191.08427: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882191.08435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882191.08439: variable 'omit' from source: magic vars 10896 1726882191.08800: variable 'ansible_distribution_major_version' from source: facts 10896 1726882191.08804: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882191.08806: variable 'omit' from source: magic vars 10896 1726882191.08818: variable 'omit' from source: magic vars 10896 1726882191.08856: variable 'omit' from source: magic vars 10896 1726882191.08902: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882191.08940: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882191.08964: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882191.08987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882191.09015: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882191.09049: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
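The 'auto-mux' and mux_client_* lines in the SSH debug output throughout this log come from OpenSSH connection multiplexing: every command reuses the existing master on the control socket /root/.ansible/cp/6f323b04b0 instead of opening a new TCP and authentication session. As a rough, hand-run illustration of the same mechanism (the exact options Ansible's ssh connection plugin passes are not shown in this log, the ControlPersist value is illustrative, and the root user is inferred from the remote home directory /root):

    # Socket path, target address, and probe command taken from the log
    CP=/root/.ansible/cp/6f323b04b0
    # Ask whether a master is already listening on the socket
    ssh -O check -o ControlPath="$CP" root@10.31.14.69
    # Reuse (or create) the master for a command, as the low-level exec calls below do
    ssh -o ControlMaster=auto -o ControlPersist=60s -o ControlPath="$CP" \
        root@10.31.14.69 'echo ~ && sleep 0'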
10896 1726882191.09058: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882191.09109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882191.09206: Set connection var ansible_connection to ssh 10896 1726882191.09221: Set connection var ansible_timeout to 10 10896 1726882191.09224: Set connection var ansible_shell_type to sh 10896 1726882191.09230: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882191.09235: Set connection var ansible_shell_executable to /bin/sh 10896 1726882191.09241: Set connection var ansible_pipelining to False 10896 1726882191.09259: variable 'ansible_shell_executable' from source: unknown 10896 1726882191.09262: variable 'ansible_connection' from source: unknown 10896 1726882191.09265: variable 'ansible_module_compression' from source: unknown 10896 1726882191.09267: variable 'ansible_shell_type' from source: unknown 10896 1726882191.09269: variable 'ansible_shell_executable' from source: unknown 10896 1726882191.09271: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882191.09273: variable 'ansible_pipelining' from source: unknown 10896 1726882191.09276: variable 'ansible_timeout' from source: unknown 10896 1726882191.09280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882191.09397: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882191.09404: variable 'omit' from source: magic vars 10896 1726882191.09417: starting attempt loop 10896 1726882191.09420: running the handler 10896 1726882191.09427: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882191.09441: _low_level_execute_command(): starting 10896 1726882191.09448: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882191.09929: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882191.09933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882191.09936: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 10896 1726882191.09939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882191.09983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/6f323b04b0' <<< 10896 1726882191.09987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882191.10063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882191.11642: stdout chunk (state=3): >>>/root <<< 10896 1726882191.11789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882191.11792: stdout chunk (state=3): >>><<< 10896 1726882191.11797: stderr chunk (state=3): >>><<< 10896 1726882191.11818: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882191.11911: _low_level_execute_command(): starting 10896 1726882191.11915: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882191.1182826-12503-206166539527922 `" && echo ansible-tmp-1726882191.1182826-12503-206166539527922="` echo /root/.ansible/tmp/ansible-tmp-1726882191.1182826-12503-206166539527922 `" ) && sleep 0' 10896 1726882191.12354: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882191.12358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882191.12377: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882191.12421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882191.12442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 
1726882191.12445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882191.12514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882191.14386: stdout chunk (state=3): >>>ansible-tmp-1726882191.1182826-12503-206166539527922=/root/.ansible/tmp/ansible-tmp-1726882191.1182826-12503-206166539527922 <<< 10896 1726882191.14466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882191.14491: stderr chunk (state=3): >>><<< 10896 1726882191.14499: stdout chunk (state=3): >>><<< 10896 1726882191.14512: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882191.1182826-12503-206166539527922=/root/.ansible/tmp/ansible-tmp-1726882191.1182826-12503-206166539527922 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882191.14536: variable 'ansible_module_compression' from source: unknown 10896 1726882191.14574: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10896 1726882191.14609: variable 'ansible_facts' from source: unknown 10896 1726882191.14661: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882191.1182826-12503-206166539527922/AnsiballZ_command.py 10896 1726882191.14756: Sending initial data 10896 1726882191.14759: Sent initial data (156 bytes) 10896 1726882191.15166: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882191.15170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882191.15172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882191.15174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 
debug2: match found <<< 10896 1726882191.15176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882191.15224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882191.15227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882191.15290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882191.16799: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 10896 1726882191.16807: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10896 1726882191.16861: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882191.16924: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpf8u8c0g1 /root/.ansible/tmp/ansible-tmp-1726882191.1182826-12503-206166539527922/AnsiballZ_command.py <<< 10896 1726882191.16928: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882191.1182826-12503-206166539527922/AnsiballZ_command.py" <<< 10896 1726882191.16983: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpf8u8c0g1" to remote "/root/.ansible/tmp/ansible-tmp-1726882191.1182826-12503-206166539527922/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882191.1182826-12503-206166539527922/AnsiballZ_command.py" <<< 10896 1726882191.17564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882191.17607: stderr chunk (state=3): >>><<< 10896 1726882191.17612: stdout chunk (state=3): >>><<< 10896 1726882191.17639: done transferring module to remote 10896 1726882191.17648: _low_level_execute_command(): starting 10896 1726882191.17652: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882191.1182826-12503-206166539527922/ /root/.ansible/tmp/ansible-tmp-1726882191.1182826-12503-206166539527922/AnsiballZ_command.py && sleep 0' 10896 1726882191.18070: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882191.18079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882191.18099: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882191.18111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882191.18161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882191.18166: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882191.18170: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882191.18229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882191.19961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882191.19987: stderr chunk (state=3): >>><<< 10896 1726882191.19990: stdout chunk (state=3): >>><<< 10896 1726882191.20011: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882191.20014: _low_level_execute_command(): starting 10896 1726882191.20017: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882191.1182826-12503-206166539527922/AnsiballZ_command.py && sleep 0' 10896 1726882191.20459: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882191.20462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882191.20464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882191.20466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 10896 1726882191.20468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882191.20517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882191.20524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882191.20526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882191.20590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882191.36417: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:c1:46:63:3b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.69/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3480sec preferred_lft 3480sec\n inet6 fe80::8ff:c1ff:fe46:633b/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.69 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.69 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:29:51.354403", "end": "2024-09-20 21:29:51.362525", "delta": "0:00:00.008122", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10896 1726882191.37892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 10896 1726882191.37899: stdout chunk (state=3): >>><<< 10896 1726882191.37901: stderr chunk (state=3): >>><<< 10896 1726882191.38238: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:c1:46:63:3b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.69/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3480sec preferred_lft 3480sec\n inet6 fe80::8ff:c1ff:fe46:633b/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.69 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.69 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:29:51.354403", "end": "2024-09-20 21:29:51.362525", "delta": "0:00:00.008122", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
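The cmd field in the module result above is the routes/DNS check defined at check_network_dns.yml:6; reflowed here for readability (content unchanged, comments added):

    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    # Print resolv.conf if present; otherwise say so and list any resolv.* leftovers
    if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf
    else
        echo NO /etc/resolv.conf
        ls -alrtF /etc/resolv.* || :
    fi

Its stdout, echoed in the task result below, lists only lo and eth0 with a DHCP default route and a NetworkManager-generated resolv.conf, i.e. no leftover bond or testbr interfaces, consistent with the "Verify network state restored to default" step that included this file.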
10896 1726882191.38247: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882191.1182826-12503-206166539527922/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882191.38251: _low_level_execute_command(): starting 10896 1726882191.38253: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882191.1182826-12503-206166539527922/ > /dev/null 2>&1 && sleep 0' 10896 1726882191.39277: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882191.39280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 10896 1726882191.39282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10896 1726882191.39284: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882191.39287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882191.39459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882191.39612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882191.39689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882191.41476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882191.41505: stderr chunk (state=3): >>><<< 10896 1726882191.41526: stdout chunk (state=3): >>><<< 10896 1726882191.41577: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882191.41638: handler run complete 10896 1726882191.41668: Evaluated conditional (False): False 10896 1726882191.41904: attempt loop complete, returning result 10896 1726882191.41907: _execute() done 10896 1726882191.41909: dumping result to json 10896 1726882191.41912: done dumping result, returning 10896 1726882191.41914: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [12673a56-9f93-8b02-b216-000000000570] 10896 1726882191.41916: sending task result for task 12673a56-9f93-8b02-b216-000000000570 10896 1726882191.41999: done sending task result for task 12673a56-9f93-8b02-b216-000000000570 10896 1726882191.42002: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008122", "end": "2024-09-20 21:29:51.362525", "rc": 0, "start": "2024-09-20 21:29:51.354403" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:c1:46:63:3b brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.14.69/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3480sec preferred_lft 3480sec inet6 fe80::8ff:c1ff:fe46:633b/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.69 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.69 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 10896 1726882191.42107: no more pending results, returning what we have 10896 1726882191.42111: results queue empty 10896 1726882191.42112: checking for any_errors_fatal 10896 1726882191.42113: done checking for any_errors_fatal 10896 1726882191.42114: checking for max_fail_percentage 10896 1726882191.42116: done checking for max_fail_percentage 10896 1726882191.42117: checking to see if all hosts have failed and the running result is not ok 10896 1726882191.42118: done checking to see if all hosts have failed 10896 1726882191.42118: getting the remaining hosts for this loop 10896 1726882191.42120: done getting the remaining hosts for this loop 10896 1726882191.42124: getting the next task for host managed_node2 10896 
1726882191.42131: done getting next task for host managed_node2 10896 1726882191.42134: ^ task is: TASK: Verify DNS and network connectivity 10896 1726882191.42138: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10896 1726882191.42149: getting variables 10896 1726882191.42151: in VariableManager get_vars() 10896 1726882191.42432: Calling all_inventory to load vars for managed_node2 10896 1726882191.42436: Calling groups_inventory to load vars for managed_node2 10896 1726882191.42439: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882191.42452: Calling all_plugins_play to load vars for managed_node2 10896 1726882191.42456: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882191.42459: Calling groups_plugins_play to load vars for managed_node2 10896 1726882191.45962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882191.49119: done with get_vars() 10896 1726882191.49149: done getting variables 10896 1726882191.49211: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:29:51 -0400 (0:00:00.416) 0:00:33.059 ****** 10896 1726882191.49243: entering _queue_task() for managed_node2/shell 10896 1726882191.49954: worker is 1 (out of 1 available) 10896 1726882191.49965: exiting _queue_task() for managed_node2/shell 10896 1726882191.49977: done queuing things up, now waiting for results queue to drain 10896 1726882191.49978: waiting for pending results... 
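For orientation, the remote-side sequence that the _low_level_execute_command entries for "Check routes and DNS" trace out; the same pattern starts again below for "Verify DNS and network connectivity". Commands are lightly condensed from the log, and the ansible-tmp path is the one from this particular run (a new one is generated per task):

    # 1. Discover the remote home directory
    /bin/sh -c 'echo ~ && sleep 0'
    # 2. Create a private per-task temp dir (umask 77); the full command in the log
    #    also echoes the created path back to the controller
    /bin/sh -c '( umask 77 && mkdir -p /root/.ansible/tmp && mkdir /root/.ansible/tmp/ansible-tmp-1726882191.1182826-12503-206166539527922 ) && sleep 0'
    # 3. AnsiballZ_command.py is then uploaded into that dir over sftp (the SSH2_FXP_* messages)
    # 4. Make the dir and the payload executable
    /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882191.1182826-12503-206166539527922/ /root/.ansible/tmp/ansible-tmp-1726882191.1182826-12503-206166539527922/AnsiballZ_command.py && sleep 0'
    # 5. Run the packaged module with the remote Python
    /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882191.1182826-12503-206166539527922/AnsiballZ_command.py && sleep 0'
    # 6. Remove the temp dir once the result has been read back
    /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882191.1182826-12503-206166539527922/ > /dev/null 2>&1 && sleep 0'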
10896 1726882191.50472: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 10896 1726882191.50802: in run() - task 12673a56-9f93-8b02-b216-000000000571 10896 1726882191.50806: variable 'ansible_search_path' from source: unknown 10896 1726882191.50809: variable 'ansible_search_path' from source: unknown 10896 1726882191.51201: calling self._execute() 10896 1726882191.51204: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882191.51207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882191.51209: variable 'omit' from source: magic vars 10896 1726882191.52000: variable 'ansible_distribution_major_version' from source: facts 10896 1726882191.52004: Evaluated conditional (ansible_distribution_major_version != '6'): True 10896 1726882191.52131: variable 'ansible_facts' from source: unknown 10896 1726882191.53746: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 10896 1726882191.53759: variable 'omit' from source: magic vars 10896 1726882191.53818: variable 'omit' from source: magic vars 10896 1726882191.53978: variable 'omit' from source: magic vars 10896 1726882191.54029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10896 1726882191.54069: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10896 1726882191.54300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10896 1726882191.54303: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882191.54306: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10896 1726882191.54308: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10896 1726882191.54311: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882191.54313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882191.54434: Set connection var ansible_connection to ssh 10896 1726882191.54632: Set connection var ansible_timeout to 10 10896 1726882191.54649: Set connection var ansible_shell_type to sh 10896 1726882191.54901: Set connection var ansible_module_compression to ZIP_DEFLATED 10896 1726882191.54904: Set connection var ansible_shell_executable to /bin/sh 10896 1726882191.54906: Set connection var ansible_pipelining to False 10896 1726882191.54909: variable 'ansible_shell_executable' from source: unknown 10896 1726882191.54912: variable 'ansible_connection' from source: unknown 10896 1726882191.54914: variable 'ansible_module_compression' from source: unknown 10896 1726882191.54916: variable 'ansible_shell_type' from source: unknown 10896 1726882191.54918: variable 'ansible_shell_executable' from source: unknown 10896 1726882191.54920: variable 'ansible_host' from source: host vars for 'managed_node2' 10896 1726882191.54921: variable 'ansible_pipelining' from source: unknown 10896 1726882191.54923: variable 'ansible_timeout' from source: unknown 10896 1726882191.54926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10896 1726882191.55177: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882191.55196: variable 'omit' from source: magic vars 10896 1726882191.55207: starting attempt loop 10896 1726882191.55214: running the handler 10896 1726882191.55228: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10896 1726882191.55250: _low_level_execute_command(): starting 10896 1726882191.55261: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10896 1726882191.56046: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882191.56059: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882191.56074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882191.56100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882191.56202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882191.56225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882191.56243: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882191.56512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882191.57902: stdout chunk (state=3): >>>/root <<< 10896 1726882191.58057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882191.58060: stdout chunk (state=3): >>><<< 10896 1726882191.58063: stderr chunk (state=3): >>><<< 10896 1726882191.58180: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882191.58183: _low_level_execute_command(): starting 10896 1726882191.58186: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882191.5808313-12522-136263901028790 `" && echo ansible-tmp-1726882191.5808313-12522-136263901028790="` echo /root/.ansible/tmp/ansible-tmp-1726882191.5808313-12522-136263901028790 `" ) && sleep 0' 10896 1726882191.59152: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882191.59155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882191.59318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882191.59321: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882191.59324: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882191.59326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882191.59577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882191.61447: stdout chunk (state=3): >>>ansible-tmp-1726882191.5808313-12522-136263901028790=/root/.ansible/tmp/ansible-tmp-1726882191.5808313-12522-136263901028790 <<< 10896 1726882191.61579: stdout chunk (state=3): >>><<< 10896 1726882191.61583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882191.61588: stderr chunk (state=3): >>><<< 10896 1726882191.61612: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882191.5808313-12522-136263901028790=/root/.ansible/tmp/ansible-tmp-1726882191.5808313-12522-136263901028790 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882191.61653: variable 'ansible_module_compression' from source: unknown 10896 1726882191.61707: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10896roiuymk0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10896 1726882191.61747: variable 'ansible_facts' from source: unknown 10896 1726882191.61833: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882191.5808313-12522-136263901028790/AnsiballZ_command.py 10896 1726882191.62000: Sending initial data 10896 1726882191.62004: Sent initial data (156 bytes) 10896 1726882191.62570: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882191.62701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882191.62705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882191.62707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882191.62710: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882191.62721: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882191.62738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882191.62823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882191.64375: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 10896 1726882191.64379: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 10896 1726882191.64382: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 
10896 1726882191.64438: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10896 1726882191.64515: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10896roiuymk0/tmpcsl_mru8 /root/.ansible/tmp/ansible-tmp-1726882191.5808313-12522-136263901028790/AnsiballZ_command.py <<< 10896 1726882191.64518: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882191.5808313-12522-136263901028790/AnsiballZ_command.py" <<< 10896 1726882191.64572: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10896roiuymk0/tmpcsl_mru8" to remote "/root/.ansible/tmp/ansible-tmp-1726882191.5808313-12522-136263901028790/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882191.5808313-12522-136263901028790/AnsiballZ_command.py" <<< 10896 1726882191.65564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882191.65567: stdout chunk (state=3): >>><<< 10896 1726882191.65570: stderr chunk (state=3): >>><<< 10896 1726882191.65572: done transferring module to remote 10896 1726882191.65574: _low_level_execute_command(): starting 10896 1726882191.65577: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882191.5808313-12522-136263901028790/ /root/.ansible/tmp/ansible-tmp-1726882191.5808313-12522-136263901028790/AnsiballZ_command.py && sleep 0' 10896 1726882191.66451: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882191.66465: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882191.66481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882191.66513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882191.66530: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882191.66563: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882191.66613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882191.66680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882191.66699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882191.66720: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882191.66907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882191.68986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882191.68990: stdout chunk (state=3): >>><<< 10896 1726882191.68992: stderr chunk (state=3): >>><<< 10896 1726882191.68998: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882191.69000: _low_level_execute_command(): starting 10896 1726882191.69003: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882191.5808313-12522-136263901028790/AnsiballZ_command.py && sleep 0' 10896 1726882191.70121: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882191.70125: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882191.70142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10896 1726882191.70157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10896 1726882191.70170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 10896 1726882191.70198: stderr chunk (state=3): >>>debug2: match not found <<< 10896 1726882191.70201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882191.70204: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10896 1726882191.70286: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882191.70556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882191.70607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882191.95990: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 
wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 13724 0 --:--:-- --:--:-- --:--:-- 13863\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 4684 0 --:--:-- --:--:-- --:--:-- 4619", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:29:51.853190", "end": "2024-09-20 21:29:51.958408", "delta": "0:00:00.105218", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10896 1726882191.97521: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 10896 1726882191.97525: stdout chunk (state=3): >>><<< 10896 1726882191.97533: stderr chunk (state=3): >>><<< 10896 1726882191.97601: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 13724 0 --:--:-- --:--:-- --:--:-- 13863\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 4684 0 --:--:-- --:--:-- --:--:-- 4619", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:29:51.853190", "end": "2024-09-20 21:29:51.958408", "delta": "0:00:00.105218", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 10896 1726882191.97605: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882191.5808313-12522-136263901028790/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10896 1726882191.97607: _low_level_execute_command(): starting 10896 1726882191.97610: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882191.5808313-12522-136263901028790/ > /dev/null 2>&1 && sleep 0' 10896 1726882191.98227: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10896 1726882191.98237: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10896 1726882191.98316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10896 1726882191.98388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 10896 1726882191.98427: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10896 1726882191.98430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10896 1726882191.98533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10896 1726882192.00338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10896 1726882192.00342: stdout chunk (state=3): >>><<< 10896 1726882192.00347: stderr chunk (state=3): >>><<< 10896 1726882192.00361: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10896 1726882192.00369: handler run complete 10896 1726882192.00386: Evaluated conditional (False): False 10896 1726882192.00399: attempt loop complete, returning result 10896 1726882192.00402: _execute() done 10896 1726882192.00405: dumping result to json 10896 1726882192.00409: done dumping result, returning 10896 1726882192.00416: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [12673a56-9f93-8b02-b216-000000000571] 10896 1726882192.00422: sending task result for task 12673a56-9f93-8b02-b216-000000000571 10896 1726882192.00523: done sending task result for task 12673a56-9f93-8b02-b216-000000000571 10896 1726882192.00526: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.105218", "end": "2024-09-20 21:29:51.958408", "rc": 0, "start": "2024-09-20 21:29:51.853190" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 13724 0 --:--:-- --:--:-- --:--:-- 13863 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 4684 0 --:--:-- --:--:-- --:--:-- 4619 10896 1726882192.00590: no more pending results, returning what we have 10896 1726882192.00597: results queue empty 10896 1726882192.00598: checking for any_errors_fatal 10896 1726882192.00610: done checking for 
any_errors_fatal 10896 1726882192.00611: checking for max_fail_percentage 10896 1726882192.00612: done checking for max_fail_percentage 10896 1726882192.00613: checking to see if all hosts have failed and the running result is not ok 10896 1726882192.00614: done checking to see if all hosts have failed 10896 1726882192.00615: getting the remaining hosts for this loop 10896 1726882192.00616: done getting the remaining hosts for this loop 10896 1726882192.00620: getting the next task for host managed_node2 10896 1726882192.00630: done getting next task for host managed_node2 10896 1726882192.00632: ^ task is: TASK: meta (flush_handlers) 10896 1726882192.00634: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882192.00638: getting variables 10896 1726882192.00640: in VariableManager get_vars() 10896 1726882192.00679: Calling all_inventory to load vars for managed_node2 10896 1726882192.00682: Calling groups_inventory to load vars for managed_node2 10896 1726882192.00684: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882192.00704: Calling all_plugins_play to load vars for managed_node2 10896 1726882192.00707: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882192.00711: Calling groups_plugins_play to load vars for managed_node2 10896 1726882192.01661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882192.03159: done with get_vars() 10896 1726882192.03190: done getting variables 10896 1726882192.03260: in VariableManager get_vars() 10896 1726882192.03276: Calling all_inventory to load vars for managed_node2 10896 1726882192.03278: Calling groups_inventory to load vars for managed_node2 10896 1726882192.03280: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882192.03296: Calling all_plugins_play to load vars for managed_node2 10896 1726882192.03299: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882192.03302: Calling groups_plugins_play to load vars for managed_node2 10896 1726882192.04225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882192.05186: done with get_vars() 10896 1726882192.05214: done queuing things up, now waiting for results queue to drain 10896 1726882192.05216: results queue empty 10896 1726882192.05217: checking for any_errors_fatal 10896 1726882192.05222: done checking for any_errors_fatal 10896 1726882192.05223: checking for max_fail_percentage 10896 1726882192.05224: done checking for max_fail_percentage 10896 1726882192.05225: checking to see if all hosts have failed and the running result is not ok 10896 1726882192.05226: done checking to see if all hosts have failed 10896 1726882192.05227: getting the remaining hosts for this loop 10896 1726882192.05228: done getting the remaining hosts for this loop 10896 1726882192.05231: getting the next task for host managed_node2 10896 1726882192.05234: done getting next task for host managed_node2 10896 1726882192.05236: ^ task is: TASK: meta (flush_handlers) 10896 1726882192.05237: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10896 1726882192.05240: getting variables 10896 1726882192.05241: in VariableManager get_vars() 10896 1726882192.05255: Calling all_inventory to load vars for managed_node2 10896 1726882192.05257: Calling groups_inventory to load vars for managed_node2 10896 1726882192.05258: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882192.05263: Calling all_plugins_play to load vars for managed_node2 10896 1726882192.05265: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882192.05268: Calling groups_plugins_play to load vars for managed_node2 10896 1726882192.07035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882192.08848: done with get_vars() 10896 1726882192.08868: done getting variables 10896 1726882192.09129: in VariableManager get_vars() 10896 1726882192.09144: Calling all_inventory to load vars for managed_node2 10896 1726882192.09147: Calling groups_inventory to load vars for managed_node2 10896 1726882192.09149: Calling all_plugins_inventory to load vars for managed_node2 10896 1726882192.09158: Calling all_plugins_play to load vars for managed_node2 10896 1726882192.09161: Calling groups_plugins_inventory to load vars for managed_node2 10896 1726882192.09164: Calling groups_plugins_play to load vars for managed_node2 10896 1726882192.10752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10896 1726882192.12517: done with get_vars() 10896 1726882192.12539: done queuing things up, now waiting for results queue to drain 10896 1726882192.12542: results queue empty 10896 1726882192.12542: checking for any_errors_fatal 10896 1726882192.12544: done checking for any_errors_fatal 10896 1726882192.12544: checking for max_fail_percentage 10896 1726882192.12545: done checking for max_fail_percentage 10896 1726882192.12546: checking to see if all hosts have failed and the running result is not ok 10896 1726882192.12547: done checking to see if all hosts have failed 10896 1726882192.12548: getting the remaining hosts for this loop 10896 1726882192.12548: done getting the remaining hosts for this loop 10896 1726882192.12551: getting the next task for host managed_node2 10896 1726882192.12554: done getting next task for host managed_node2 10896 1726882192.12555: ^ task is: None 10896 1726882192.12556: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10896 1726882192.12558: done queuing things up, now waiting for results queue to drain 10896 1726882192.12558: results queue empty 10896 1726882192.12559: checking for any_errors_fatal 10896 1726882192.12560: done checking for any_errors_fatal 10896 1726882192.12560: checking for max_fail_percentage 10896 1726882192.12561: done checking for max_fail_percentage 10896 1726882192.12562: checking to see if all hosts have failed and the running result is not ok 10896 1726882192.12563: done checking to see if all hosts have failed 10896 1726882192.12565: getting the next task for host managed_node2 10896 1726882192.12567: done getting next task for host managed_node2 10896 1726882192.12568: ^ task is: None 10896 1726882192.12569: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node2              : ok=76   changed=3    unreachable=0    failed=0    skipped=60   rescued=0    ignored=0

Friday 20 September 2024  21:29:52 -0400 (0:00:00.634)       0:00:33.693 ******
===============================================================================
Install dnsmasq --------------------------------------------------------- 2.29s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 1.95s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.90s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:6
Create test interfaces -------------------------------------------------- 1.85s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
fedora.linux_system_roles.network : Check which services are running ---- 1.81s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.59s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.25s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 1.23s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.14s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.90s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.75s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gather the minimum subset of ansible_facts required by the network role test --- 0.74s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Check if system is ostree ----------------------------------------------- 0.67s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Install pgrep, sysctl --------------------------------------------------- 0.67s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
Verify DNS and network connectivity ------------------------------------- 0.63s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.63s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.61s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Remove test interfaces -------------------------------------------------- 0.51s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.45s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Stop dnsmasq/radvd services --------------------------------------------- 0.44s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23
10896 1726882192.12680: RUNNING CLEANUP
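For reference, the shell payload behind the "Verify DNS and network connectivity" task (0.63s in the summary above) appears verbatim in the cmd field of its result. The sketch below simply lifts that fragment into a standalone script so the same check can be re-run by hand on a managed node; the bash shebang is an addition (the task ran the fragment through /bin/sh on the target), everything else is copied from the log.

    #!/bin/bash
    # Reconstructed from the "cmd" field of the task result above; the shebang is added for stand-alone use.
    set -euo pipefail
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
        if ! getent hosts "$host"; then
            echo FAILED to lookup host "$host"
            exit 1
        fi
        if ! curl -o /dev/null https://"$host"; then
            echo FAILED to contact host "$host"
            exit 1
        fi
    done

getent prints the resolved addresses (the IPv6 records shown in STDOUT above), and curl's progress meter accounts for the STDERR lines; rc=0 means both hosts resolved and answered over HTTPS.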
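The task's remote execution lifecycle is also fully visible in the _low_level_execute_command() calls above. As an aid to reading the trace, the commented sketch below restates that command sequence in order; the temporary directory name is the one generated for this particular run and changes on every invocation.

    #!/bin/bash
    # Condensed restatement of the remote command sequence seen in the trace above (one command task, one host).
    tmp=/root/.ansible/tmp/ansible-tmp-1726882191.5808313-12522-136263901028790   # per-run name from this log

    echo ~                                                        # 1. discover the remote user's home directory
    ( umask 77 && mkdir -p /root/.ansible/tmp && mkdir "$tmp" )   # 2. create a private per-task temp dir
    # 3. sftp put AnsiballZ_command.py "$tmp"/AnsiballZ_command.py  (module payload copied over the multiplexed SSH session)
    chmod u+x "$tmp"/ "$tmp"/AnsiballZ_command.py                 # 4. make the wrapper executable
    /usr/bin/python3.12 "$tmp"/AnsiballZ_command.py               # 5. run the module; it prints the JSON result captured above
    rm -f -r "$tmp"/ > /dev/null 2>&1                             # 6. remove the temp dir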